In our second MV Journal Follow-up, we will review a few topics that stopped at a cliff-hanger… or something similar, I guess. But if someone asks you what the best thing you got from this post was, you can point them to McBroken. If you’re cruising around the USA, never forget to check whether the nearest McDonald’s has a working McFlurry machine. Amazed? The internet produces great things, right? Unless you’re a developer mandated to use a Cloud-based IDE. In that case, I’m not sure we’re going in the right direction, but the jury is still out on this one. Drop me a few words if you get to try one.
The Resource Balance Loop - Remote Development IDEs are back again
In the olden days of mainframes, from the mid 1970’s to the mid 1980’s, most people used real text-terminals to communicate with large computers. These real text-terminals were neither computers nor emulated text-terminals. They consisted only of a screen, keyboard, and only enough memory to store a screenfull or so of text (a few kilobytes). Users typed in programs, ran programs, wrote documents, issued printing commands, etc. A cable connected the terminal to the computer (often indirectly). It was called a terminal since it was located at the terminal end of this cable. Some text-terminals were called “graphic” but the resolution was poor and the speed slow by today’s standards due to the high cost of memory and the limited speed of the conventional serial port, etc.
Text excerpt from linux.die.net
A decade ago (probably more), I was again at the doctor’s office with yet another case of tonsillitis. The procedure was always the same: a throat inspection to assess the situation and a few questions to decide whether I needed an antibiotics prescription. This time I noticed some differences at the doctor’s desk. A very slim LCD with an inconspicuous “Siemens” sticker, a keyboard and a mouse. No visible tower. After a few seconds, the doctor apologised for the time it took just to get a prescription. “This is only a monitor,” she said. “They told me that the computer is at the Hospital.” The comment wasn’t completely random, since she had started the consultation with a few questions, including my occupation.
It made perfect sense at the time. A closed system for hospitals where all resident activities at the desktop were well-defined. A mainframe-like setup with some slim hosts would be enough for day-to-day work. The application layer would be a different problem altogether, but the concentration of resources at the Hospital data centre would suffice. Back at the office, after recovering from my infirmities, I shared the setup with my manager, and he pointed out that we were going to have a similar architecture for call centres, with “dumb hosts” replacing existing desktops. “Back to the past”, he said.
Over the last few decades, we’ve seen a back and forth, with resources shifting from computation aggregates, whether data centres or mainframe setups, to user desktops and back again. Reasons vary from security concerns to monetary restrictions, going through systems maintenance and upgrades. For me, the question was never settled, and my preference would always fall on capable desktops with good intranet or internet connectivity, primarily for convenience and ignoring all the other corporate aspects that the C-suite needs to weigh in on.
In the development world, even with the advent of large-scale data processing with Hadoop and all the tools derived from the Map-Reduce revolution, we still saw Engineers and Programmers wanting powerful machines that could run all the tools simultaneously and mimic production environments. Rarely have I seen developers working in remote environments and liking the experience. You might need a database running outside your local environment, or to interact with a development message queue, but if developers can contain the technology zoo in a local pen, they will.
With the Cloud eating Software everywhere, running complex environments locally became almost impossible. Most of the tools aren’t readily available to install locally, and some require too many resources, even for powerful machines. Take, for instance, the very simple SQS from AWS, a message queue only available in the Cloud, or Forecast, an AI system designed for resource demand forecasting, also from AWS. Are you guessing where this is going? “Back to the past”, I say.
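As a concrete illustration of what “only available in the Cloud” means in practice, here is a minimal sketch of publishing an event to SQS with boto3. The queue URL and the event payload are made up, and the client is passed in explicitly (in real code you would create it with `boto3.client("sqs")`), but there is no SQS server you can spin up on your laptop to point it at.

```python
import json

def send_event(sqs_client, queue_url: str, event: dict) -> str:
    """Publish a JSON-encoded event to an SQS queue.

    `sqs_client` is expected to behave like a boto3 SQS client,
    i.e. the object returned by `boto3.client("sqs")`.
    """
    response = sqs_client.send_message(
        QueueUrl=queue_url,             # the queue's HTTPS endpoint
        MessageBody=json.dumps(event),  # SQS message bodies are plain strings
    )
    # SQS acknowledges every accepted message with a MessageId.
    return response["MessageId"]
```

Injecting the client also makes the function exercisable with a stub, which is about the closest you get to a “local” SQS without third-party emulators.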
On August 11, GitHub announced that it is migrating all internal development to Codespaces. Mr Cory Wilkerson, a Senior Director of Engineering at GitHub, paints a rosy picture in his blog post announcing the shift to the cloud work environment, but it takes some development iterations to get the feel of such a change. I’m sure that we’ll have an MV Journal Follow-up on this one.
Cloud9 is the development environment from AWS. A quick visit to the product website features a “Benefits” header followed by “Code with just a browser”. At first glance, such a statement scares me. Whether at Codespaces or Cloud9, the Browser brings a set of hidden moving parts on top of an insecurity layer. It isn’t a benefit per se, although I’m willing to accept that it is a ubiquitous piece of Software with a high level of accessibility, but that is it. From a developer standpoint, showing a few code chops on your parents’ computer might be amusing, but doing it professionally requires a sensible setup that goes well beyond the Browser. Check your chair and desk, for instance.
Microsoft Azure Notebooks is another type of Cloud IDE, one that packages Jupyter Notebooks, the fundamental technology of the product. Jupyter is a non-profit open-source project that provides a Python development environment primarily targeting data scientists and machine learning enthusiasts. It gives an integrated environment with many pre-installed libraries and tools for data processing and graphics display. Nowadays, the project encompasses more programming languages, but the true benefit of this environment lies in its hybrid approach. You can develop your code remotely via your Browser or download the Jupyter Notebook onto your machine and extract all the power from your local machine instead of paying for Azure computation.
Finally, I would like to mention Observable, co-founded by the visualisation guru Mike Bostock. The objective of Observable is to provide a development environment to present data in the best way possible. You can see an Observable Notebook as a refinement of a Jupyter Notebook made purely for data presentation, although it can also take on considerable amounts of computation. I could see such an environment being feasible for data scientists who focus primarily on data interpretation and presentation, falling into my tonsillitis-diagnosing doctor’s type of work. They also provide extension points using third-party packages as long as they are supplied via npm, but the breadth of visualisation libraries already available will probably deter you from taking the hard path.
You’ll find many flavours of online editors, and I bet that more will show up from companies like JetBrains. Still, I’m not old enough to see the dumb-terminal approach as a reliable way of doing development work, at least in heterogeneous environments where Engineers have a wide array of skills and don’t fall into narrow domains such as front-end development or data presentation. Nevertheless, I will check Cloud9 for some microservice work at Metric and maybe get converted to the Browser way… but don’t bet on it :)
MV Journal Follow-up
On May 20, a couple of weeks after my piece about Basecamp, Mr David Heinemeier Hansson, or DHH as he’s known online, published an article titled After the storm. Even now, it’s too soon for such a sentiment, but Mr DHH reports that after a minor blip in departing customers, business bounced back and kept growing as expected. One of my previous managers used to say, “The cemetery is home to many irreplaceable heroes”, and this sentiment is shared in Mr DHH’s words regarding departed employees. Life goes on at Basecamp, with new employees and customers huddling around the fire of a successful business.
It escalated. Epic is pushing buttons all over the world, testing the limits of the law but still waiting for a verdict. Australia was the latest country to accept Epic’s claims in court, and another legal battle will pit Epic against Apple in this World Market Wars. The strategy looks clear enough for us, mortal bystanders. A good victory in any country might trigger a cascade in the remaining courts and/or push Apple to make changes globally instead of managing each country’s market as a vertical. In between, Apple is losing ground, with lawsuits eroding its claims and forcing changes even before Epic enters the ring. A class-action suit from 2019 ended in August, with Apple conceding to make changes that allow app developers to redirect users to alternative payment options. This is a step toward an Epic win, but the amusing part lies with the judge. Ms Yvonne Gonzalez Rogers is the Californian judge overseeing the class-action settlement, but she also presides over Epic vs Apple.

Far from California, South Korea dealt a significant blow to every digital Market, and it will make a dent in the profits of big players such as Apple. The bill (I recommend the “translate” option of your favourite Browser to take a peek) establishes that digital market makers need to allow third-party payment systems. Not only that, it also adds provisions to avoid delays in application approvals and blocks that might be used to deter companies from going outside the app store. Apple and Google are indisputable giant whales with their digital markets, but it looks like the law takes the underdog’s side. Let’s wait again for a few months.
First, I’ll start by declaring that Copilot is a failure, at least for now.
Mr Brendan Dolan-Gavitt is an Assistant Professor at NYU Tandon. He noticed some strange configurations in GitHub’s Copilot Visual Studio Code extension to detect less desirable words. You can follow his tweet sequence about this digital adventure here, but one of his findings was very entertaining. The extension uses a blocklist of words to censor input, and within that set, one can find “q rsqrt”. If you’re not familiar with the string, I’ll help you. Originally it is Q_rsqrt, the Quake fast inverse square root function that Copilot regurgitated as an auto-complete option. Mr Brendan’s tweets reveal much more, but this extension is just one more example of how not to do it. But that’s not all. A recent paper suggests that 40 per cent of Copilot’s suggestions are erroneous from a security point of view. This shouldn’t be a shock considering my article. Copilot’s training codebase was fed as is. Who would have guessed that it would drop such foul code? I’m keen on loonshots and hope that Copilot becomes a successful tool to help development in many ways, but right now, I defer to my first sentence.
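For the curious, Q_rsqrt approximates 1/√x with a bit-level trick: reinterpret the float’s bits as an integer, subtract from a magic constant to get a first guess, then refine it with one Newton-Raphson step. Here is a rough Python translation of the original C, using `struct` to do the bit reinterpretation (the function name and comments are mine):

```python
import struct

def q_rsqrt(number: float) -> float:
    """Approximate 1/sqrt(number) via the Quake III bit-level trick."""
    x2 = number * 0.5
    # Reinterpret the 32-bit float's bits as an unsigned integer.
    i = struct.unpack("<I", struct.pack("<f", number))[0]
    i = 0x5F3759DF - (i >> 1)  # magic constant gives a good first guess
    # Reinterpret the integer bits back as a 32-bit float.
    y = struct.unpack("<f", struct.pack("<I", i))[0]
    y = y * (1.5 - (x2 * y * y))  # one Newton-Raphson refinement step
    return y
```

A single refinement step already lands within roughly 0.2% of the true value, which was plenty for Quake’s lighting calculations.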
Well, this is a tangent to the original article, but it shows that anyone can be impacted by repair restrictions, even the fast-food giant McDonald’s. At least in the United States, you’ll need some good alignment from the ice cream gods to get a taste of a McFlurry. The failing ice cream machines are a running joke on the internet, and even McDonald’s took the internet’s side with the tweet:
Usually, when I went for a surfing bout, I visited a beach webcam site to see if the surf was actually up. If you want a McFlurry, you’d better pay a visit to McBroken and see if the local machine is working. Not having a McFlurry whenever you visit a McDonald’s has become a serious issue for the USA’s Federal Trade Commission, or FTC. Jokes aside, the FTC is probing McDonald’s since it seems that HQ doesn’t allow franchisees to fix their own machines, which leads to long wait times until customers can get their tasty treat. I’m on the FTC’s side on this one for sure.