This post describes the thought process I went through and the technical decisions I made while developing APL CodeGolf. It all started with an idea a colleague floated around a month before the conference: Sam Gutsell suggested it might be a good idea to run a code golf competition in which delegates could compete to produce the tersest solutions to a defined set of problems. See the definition of code golfing here. In a team meeting we discussed how we might run such a competition, and a web application was deemed the most fitting solution, even though the development time was incredibly tight. All in all I had three weeks' worth of development time to build an unprecedented application from front-end to back-end (YIKES!).
Beginning the Stack
The first thing I needed to decide on was the technical stack for APL CodeGolf. Since the project required a very fast turnaround, I wanted to use tools I was comfortable with and that allowed easy set-up and configuration. The choice of tools was largely determined by how I modularised the project. Three main modules were to be developed: the site, an API server, and a server that runs people's solutions against test cases. Of course, the project also needed a database, so I went for a self-hosted MongoDB instance.
The front-end was the first module I started developing. I often like to work this way so I can build some components that I can later use to test at least the connection from the front-end to the API server. I'm currently a big fan of React and Redux as front-end frameworks; React provides a very fast and responsive view layer, and Redux gives me a single predictable state that's easy to debug and test. Along with a couple of plug-in libraries, including Bootstrap, the front-end micro stack was complete. Initial development was slightly slow, as Redux requires a fair quantity of code to set up and prime for things such as asynchronous HTTP requests. React, in comparison, takes almost no time to set up (provided you use the create-react-app tool), and once the foundational Redux code is written, joining the two is very intuitive. After writing some preliminary code for both React and Redux it was time to create a couple of testing elements. The most essential nut to crack was authentication, so I created a login/sign-up form and switched over to developing the API server.
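To illustrate the kind of set-up code Redux demands for an asynchronous flow, here is a minimal hand-rolled sketch with no libraries. The action names, state shape, and `createStore` stand-in are illustrative assumptions, not the site's actual code:

```javascript
// A Redux-style reducer for a login flow: pure function of (state, action).
// Action names (LOGIN_REQUEST etc.) are hypothetical, for illustration only.
const initialState = { loggingIn: false, user: null, error: null };

function authReducer(state = initialState, action) {
  switch (action.type) {
    case 'LOGIN_REQUEST':
      return { ...state, loggingIn: true, error: null };
    case 'LOGIN_SUCCESS':
      return { ...state, loggingIn: false, user: action.user };
    case 'LOGIN_FAILURE':
      return { ...state, loggingIn: false, error: action.error };
    default:
      return state;
  }
}

// Tiny stand-in for Redux's createStore, enough to show the dispatch cycle.
function createStore(reducer) {
  let state = reducer(undefined, { type: '@@INIT' });
  return {
    getState: () => state,
    dispatch: (action) => { state = reducer(state, action); return action; },
  };
}

// An async action: dispatch REQUEST up front, then SUCCESS/FAILURE on settle.
async function login(store, fetchUser) {
  store.dispatch({ type: 'LOGIN_REQUEST' });
  try {
    const user = await fetchUser();
    store.dispatch({ type: 'LOGIN_SUCCESS', user });
  } catch (err) {
    store.dispatch({ type: 'LOGIN_FAILURE', error: err.message });
  }
}
```

In a real app the same three-phase dispatch pattern is usually wired up with middleware such as redux-thunk rather than written by hand.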
Apart from Mongoose, I only needed two other pieces of middleware to complete the API server: Passport.js and Morgan. Passport handled authentication and Morgan handled request logging, which proved invaluable when debugging. A session-less authentication system fit the bill, and I settled on an API-key mechanism for initial authentication and for posting code solutions. Essentially, when a user signs up I compute a unique hash for them and store it in the database; on subsequent requests the client sends that hash as an HTTP header, and I compare and match it against a record in the database. Attaching these mechanisms to routes meant I had finally reached a stage where I could test the FE => API connection. With a few minor tweaks the authentication system was up and running! Oh, and by the way, I use Postman for intermediate testing to check I'm getting the results I'm after.
Linking API to MongoDB
The DB structure I'd dreamt up for storing problems and user-submitted solutions had changed a few times over the period described so far. Adding some components to allow the submission of solutions seemed like a sensible next step towards solidifying the DB structure, and it gave me more components for testing connections. The solution input form was born, and I added the necessary code to link it to the API. Being able to receive and store solutions is great, but I also needed to present a dynamic problem set so users would know what needed solving! Creating a "problems" collection and populating it with documents containing problem descriptions and test cases was a start. Then, after linking API routes in Express to fetch the data stored in MongoDB, two steps remained: calling the API route from the FE, and displaying the problem set. Calling API routes was nothing new, and by re-using some of the Redux action code I had previously written I soon had an HTTP GET request returning the problems from Mongo in a nice JSON format, ready for React consumption. Displaying the problems was a new challenge; a tab control seemed fitting, so one was introduced and the solution form was modified to appear on each problem tab.
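An Express route for fetching the problem set might look like the sketch below. The route path, field names, and the in-memory store (standing in for a Mongoose `Problem.find({})` query) are assumptions for illustration:

```javascript
// Stand-in for the MongoDB "problems" collection; in the real server this
// data would come from a Mongoose query, not an array.
const problemsStore = [
  { id: 1, title: 'Sum of a Vector', description: 'Sum the numbers in a vector.' },
  { id: 2, title: 'Reverse', description: 'Reverse a character vector.' },
];

// Handler with the (req, res) signature Express expects; wire it up with
// `app.get('/api/problems', getProblems)`. The real version would await a
// Mongoose query (e.g. `Problem.find({})`) instead of reading an array.
function getProblems(req, res) {
  res.json(problemsStore);
}
```

The front-end's Redux action then only has to GET that route and hand the JSON array straight to React for rendering in the tab control.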
The stack was starting to look very healthy, and by now I had the front-end linked to an API server that could serve requests for both record retrieval and record creation in the relevant collections. The last major part of the technical stack, the solution validator, still needed to be started. This was to be written in APL to allow easy parsing of the user-submitted code and evaluation in the Dyalog APL interpreter. I had it in my head that the API server could make HTTP calls to an APL-based server, which would evaluate the solutions and return an appropriate response, and this is exactly how I ended up doing it. I fetched a utility called Rumba which allowed me to stand up an APL-based HTTP server. To handle requests I needed to do four things:
- Check the solution is not harmful to the application / environment and conforms to the rules;
- Fetch the relevant test cases for the problem from MongoDB;
- Validate the solution against the problem's test cases;
- Return a result indicating whether or not the solution passed and if not, what test cases it failed exactly.
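The four steps above can be sketched in JavaScript (the real validator is written in APL and evaluates solutions in the Dyalog interpreter; the banned-token list, the test-case shape, and the injected `run` evaluator here are illustrative assumptions only):

```javascript
// Step 1: reject anything harmful. These tokens are example Dyalog system
// functions (shell escape, command, file write); the real list is unknown.
const BANNED = ['⎕SH', '⎕CMD', '⎕NPUT'];

function isSafe(source) {
  return !BANNED.some((tok) => source.includes(tok));
}

// Step 2 stand-in: test cases that would be fetched from MongoDB.
const testCases = [
  { input: [1, 2, 3], expected: 6 },
  { input: [10, -4], expected: 6 },
];

// Steps 3 and 4: run every case via an injected evaluator (the Dyalog
// interpreter in the real app) and report exactly which cases failed.
function validate(source, cases, run) {
  if (!isSafe(source)) return { passed: false, reason: 'unsafe code' };
  const failed = cases.filter((c) => run(source, c.input) !== c.expected);
  return failed.length === 0 ? { passed: true } : { passed: false, failed };
}
```

For a quick local check, `run` can be a JavaScript expression evaluator, e.g. `(src, input) => new Function('v', 'return ' + src)(input)`, so that the solution `'v.reduce((a, b) => a + b, 0)'` passes both cases above.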
It didn’t take much code to get the above working, and the small Dyalog APL server was ready to go. The API server communicates with it by making local HTTP calls to a specific port. FINALLY, all parts of the stack were started and able to communicate with each other. The journey of
request => response went something like this:
Client =request=> API Server =request=> APL Server =response=> API Server =response=> Client
Getting Ready for Production
With all the major hurdles out of the way, I could focus on aspects that had been neglected in the earlier stages, such as the styling of the site and code refactoring. Eventually, after a couple of bug-squashing bursts, the site was ready for its unveiling. I configured Apache with the necessary settings and fetched all of the SSL certificates from LetsEncrypt. With the domains live, I used the create-react-app build tool to produce my production code and set up the API and solution-validator servers as robust, reliable services.
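An Apache configuration for this shape of deployment might look roughly like the following. This is a hypothetical sketch: the document root, the API server's port, and the certificate paths are assumptions, not the site's actual configuration:

```apache
# Hypothetical vhost: serve the React production build as static files and
# proxy API traffic to the local Node server. Paths and port are assumptions.
<VirtualHost *:443>
    ServerName apl.codegolf.co.uk
    DocumentRoot /var/www/apl-codegolf/build

    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/apl.codegolf.co.uk/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/apl.codegolf.co.uk/privkey.pem

    ProxyPass /api http://localhost:3000/api
    ProxyPassReverse /api http://localhost:3000/api
</VirtualHost>
```

This keeps the browser talking only to Apache over HTTPS, with the API and validator servers reachable only on localhost.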
And there you have it, APL codegolf! https://apl.codegolf.co.uk
APL codegolf was a success at the conference and is currently hosting another course of holes for a separate competition.
In general I was happy with my choice of stack and tools. Separating the stack out so much allowed me to develop in a modular fashion and test each component in isolation. The code base suffered as a result of time pressure and some inconsistencies appeared but they were easily ironed out.
After the current competition I plan to make the site’s source code available; I look forward to collaboration and welcome any feedback. While I was using familiar tools, this project took me to areas I hadn’t previously explored, and it was also useful as a learning exercise.
Thank you for reading; I hope you enjoyed the article.
Former APL Developer
Knowing a software and games developer from a young age meant Callum was always intrigued by computer languages. From writing small applications and games in his free time, he progressed through sixth form and is now a Software Developer.