I spent the last 10 days at Petnica International, a science seminar in Serbia, and it was an incredible experience (as Petnica programs tend to be). I met lots of cool and interesting people from all over Europe and learned some new physics and computer science, but this post is mainly about my project there.
The project concerned optimizing the shape of a glass in order to lower the average temperature of the liquid it contains during drinking. The original idea is sourced from here. My results are quite noisy, but I'd still like to walk through the solution, since it was an interesting project to work on. (Most of this post is taken from the paper I wrote at Petnica.)
As I mentioned in my introductory post, the first part of monac (the Mona compiler) that I've built handles lexical analysis. In this post I'll explain what lexical analysis is, what part it plays in the overall compilation process, and how I've implemented it in monac (which is written in Scala). Since my primary motivation for the entire Mona project is educational, I decided to build my own lexer (lexical analyzer) from scratch. The main body of work was creating my own regular expression engine, which is at the heart of the lexer; beyond that, there is the scanning mechanism (which reads individual characters and handles whitespace and comments) and the caching mechanism for some data structures.
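To make the idea of a rule-driven lexer concrete, here is a minimal sketch of regex-based tokenization. It is not monac's actual implementation: it uses `java.util.regex` instead of a hand-rolled engine, and all names (`LexerSketch`, `Token`, `tokenize`) are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LexerSketch {
    enum Kind { IDENT, NUM, SYM }

    static final class Token {
        final Kind kind; final String text;
        Token(Kind kind, String text) { this.kind = kind; this.text = text; }
        @Override public String toString() { return kind + "(" + text + ")"; }
    }

    // Ordered rules: the first pattern that matches at the current position wins.
    private static final Kind[] KINDS = { Kind.IDENT, Kind.NUM, Kind.SYM };
    private static final Pattern[] RULES = {
        Pattern.compile("[A-Za-z_][A-Za-z0-9_]*"),  // identifiers
        Pattern.compile("[0-9]+"),                   // integer literals
        Pattern.compile("[+\\-*/=()]")               // single-character symbols
    };

    static List<Token> tokenize(String input) {
        List<Token> tokens = new ArrayList<>();
        int pos = 0;
        while (pos < input.length()) {
            // The scanning mechanism: skip whitespace between tokens.
            if (Character.isWhitespace(input.charAt(pos))) { pos++; continue; }
            boolean matched = false;
            for (int r = 0; r < RULES.length; r++) {
                Matcher m = RULES[r].matcher(input).region(pos, input.length());
                if (m.lookingAt()) {
                    tokens.add(new Token(KINDS[r], m.group()));
                    pos = m.end();
                    matched = true;
                    break;
                }
            }
            if (!matched)
                throw new IllegalArgumentException("unexpected character at " + pos);
        }
        return tokens;
    }
}
```

The real lexer replaces the `Pattern` calls with its own regular expression engine, but the overall loop (skip whitespace, try each rule in order, emit a token, advance) is the same shape.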
I'm working on a new project, a programming language named Mona, which integrates a functional programming style with systems programming and low-level features. My primary goal with such a combination is to enable as wide an abstraction spectrum as possible, making the language suitable for a broad range of applications. This is similar to what Bjarne Stroustrup set out to achieve with C++: a language for programming complex systems that require both precise control of resources and strong abstraction facilities. The difference is that instead of choosing OOP as the main abstraction paradigm, I have chosen FP (and some closely related ideas).
In image processing, much of the useful information in an image is contained in the edges of objects, and edge detection is the process of separating object edges from the rest of the image (the objects themselves and their surroundings). The obtained information can then be used for further analysis, such as detection of faces or boundaries, or for other image processing procedures such as superresolution (e.g. the decision whether to continue a gradient or place a sharper border between new pixels could depend on whether an edge is being processed).
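As a concrete illustration of the idea, here is a minimal sketch of gradient-based edge detection using the standard Sobel operator on a grayscale image represented as a 2D array. This shows the general principle only; the class and method names are hypothetical and not from any particular implementation discussed in the post.

```java
public class EdgeSketch {
    // Sobel kernels for the horizontal and vertical intensity gradients.
    private static final double[][] GX = {{-1, 0, 1}, {-2, 0, 2}, {-1, 0, 1}};
    private static final double[][] GY = {{-1, -2, -1}, {0, 0, 0}, {1, 2, 1}};

    // Gradient magnitude at each interior pixel; border pixels stay zero.
    // Large values indicate a sharp intensity change, i.e. a likely edge.
    static double[][] sobel(double[][] img) {
        int h = img.length, w = img[0].length;
        double[][] out = new double[h][w];
        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                double sx = 0, sy = 0;
                for (int j = -1; j <= 1; j++) {
                    for (int i = -1; i <= 1; i++) {
                        sx += GX[j + 1][i + 1] * img[y + j][x + i];
                        sy += GY[j + 1][i + 1] * img[y + j][x + i];
                    }
                }
                out[y][x] = Math.hypot(sx, sy);
            }
        }
        return out;
    }
}
```

Running this on an image with a vertical dark-to-bright step produces large magnitudes along the step column and zeros in the flat regions, which is exactly the separation of edges from environment described above.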
Robot Evolution is an application that uses genetic algorithms to evolve and optimize virtual multi-legged walking robots. The robots are 2D geometric constructions of rectangles connected by virtual motors; the motors apply torque to the rectangles, making the robot move.
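For illustration, here is a toy genetic algorithm in the same spirit, with the usual selection, crossover, and mutation steps. The fitness function is a simple stand-in (distance of a genome to a target vector) rather than Robot Evolution's actual fitness, which is the walking distance achieved in a physics simulation; all names here are hypothetical.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.Random;

public class GaSketch {
    static final Random RNG = new Random(42);  // seeded for reproducibility

    // Stand-in fitness: closer to the all-ones vector is better (maximum is 0).
    static double fitness(double[] g) {
        double s = 0;
        for (double x : g) s -= (x - 1.0) * (x - 1.0);
        return s;
    }

    // Uniform crossover: each gene is copied from one parent at random.
    static double[] crossover(double[] a, double[] b) {
        double[] c = new double[a.length];
        for (int i = 0; i < a.length; i++) c[i] = RNG.nextBoolean() ? a[i] : b[i];
        return c;
    }

    // Mutation: perturb each gene with small probability.
    static double[] mutate(double[] g) {
        double[] m = g.clone();
        for (int i = 0; i < m.length; i++)
            if (RNG.nextDouble() < 0.1) m[i] += RNG.nextGaussian() * 0.3;
        return m;
    }

    static double[] evolve(int popSize, int genomeLen, int generations) {
        double[][] pop = new double[popSize][genomeLen];
        for (double[] g : pop)
            for (int i = 0; i < genomeLen; i++) g[i] = RNG.nextDouble() * 4 - 2;
        for (int gen = 0; gen < generations; gen++) {
            // Rank by fitness, descending; keep the top half as parents (elitism),
            // and refill the bottom half with mutated offspring of random parents.
            Arrays.sort(pop, Comparator.comparingDouble(GaSketch::fitness).reversed());
            int half = popSize / 2;
            for (int i = half; i < popSize; i++) {
                double[] pa = pop[RNG.nextInt(half)];
                double[] pb = pop[RNG.nextInt(half)];
                pop[i] = mutate(crossover(pa, pb));
            }
        }
        Arrays.sort(pop, Comparator.comparingDouble(GaSketch::fitness).reversed());
        return pop[0];
    }
}
```

In Robot Evolution the genome would instead encode the robot's geometry and motor parameters, and evaluating fitness means running the physics simulation, but the evolutionary loop has this same structure.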
I'm currently working as an intern, and I was tasked with creating an evaluator for Java code for an online competition. Essentially, it executes code submitted by the competitors using JUnit (inside a security framework, of course) and returns some results. The final implementation is meant to be a web application that lets users submit JAR files with their classes to the server and displays a ranking and/or their individual points and test results.
The field of artificial intelligence (AI) has been quite turbulent and ever-changing in scope since its creation. This essay deals with its inception, its transition from idealistic to pragmatic goals, and its history, development, and crises; it touches on the current state of the art and concludes with a thought on where the future of AI research lies.
I've created two infographics intended to help newcomers choose the right GNU/Linux distribution. The content is practically the same in both, but the second one has some graphical improvements that aren't my own work.