October 18, 2021

Experiences in programming the cross-platform user interface, and why low code/no code is on the rise

What this article covers:

  • Why the user interface (UI) matters.
  • The transition from a world with a dominant operating system (OS), when UI development targeted the desktop computer, to a fragmented world of multiple OSs and UIs for desktop, laptop, web, tablet, smartphone and more.
  • On the way, why Java applets and browser plug-ins like Adobe Flash dropped out.
  • How improving user experience (UX) was a prime driver for change.
  • Why low code/no code (LCNC) is on the rise.
  • Level of expertise: A nostalgia trip for oldies and a historical guide for newbies.

This is a brief account of UI development within the broader topic of software development, and it reflects my personal journey to build a chess variant application for fun. The UI matters a lot because it dominates the code in most professional/commercial applications. Typically, the code that controls how you interact with your application takes up most of the program. It is often called ‘plumbing’, and most programmers re-write this part repeatedly in every application, ideally with only small changes, whereas the part of the program concerned with the core application may be just 10% to 20% of the whole code. The UI is not the only plumbing code; there is also how you manage data, store it, and access it. But data management is less affected by the fragmentation that exists today in the UI.

Picking up the story just before the web went mainstream (pre-1994), life was quite simple then: Microsoft and its Windows OS dominated, and the user interface was built for the PC. The World Wide Web brought a major change to UI development. Post 1994, with the rise of the internet and the web, UIs had to contend with browsers and with connecting to applications running on a remote server in the data center. Web applications had to deal with being offline and synchronizing when back online. There were also tricks used to maintain a web connection between a client browser and a server, so that the server could push data out to the client. This had to do with how web connections worked back then – HTTP was stateless, with no memory between one request and the next. These tricks came under the label ‘rich internet applications’ (RIA), e.g. Comet techniques, which hold an HTTP request open so a web server can push out data without the browser requesting it. Today, RIA features are built into the web application programming interfaces (APIs), from HTML5 onward (after HTML5 the web standard just became HTML, with continual updates and no version numbers), with technology like WebSocket, which creates a two-way communication channel between browser and server.
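To illustrate the long-polling idea behind Comet, here is a toy in-process sketch in Python. It is not real HTTP – a blocking queue stands in for the held connection, and the function and event names are my own invention:

```python
import queue
import threading

# A toy stand-in for a Comet-style long poll: the 'server' holds the
# client's request open (a blocking queue read) until it has an event
# to push, instead of answering immediately with 'nothing new'.
events = queue.Queue()

def long_poll(timeout=2.0):
    """Block until the server pushes an event, or give up so the client can re-poll."""
    try:
        return events.get(timeout=timeout)
    except queue.Empty:
        return None  # in real Comet, the client would immediately re-issue the request

# The 'server' pushes an event shortly after the poll is opened.
threading.Timer(0.1, events.put, args=("opponent moved: e2-e4",)).start()

result = long_poll()
print(result)
```

With WebSocket, no such trick is needed: the browser and server keep a single full-duplex connection open, and either side can send at any time.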

Aside: In 1998 I came across an unusual UI using Crystal Reports, a business reporting solution. A major European buildings construction company created its custom ERP system with Crystal as the front-end UI. As Crystal supported embedded code, it was effectively a low code solution for the UI, but not a natural UI choice. There were hundreds of reports that doubled as UI, and maintenance was a massive undertaking involving many tens of developers.

My interest in, and frustration with, the UI began with my pastime desire to code a chess variant with a multi-player interface. I programmed in Java, and hence the UI came in the form of Java applets that could run in any browser. The problem was that soon after I developed this application, running applets came to be considered a security risk and they needed a security certificate; then browsers blocked them altogether. Back to the drawing board for me.

Then in 2007 Apple launched the iPhone, the first modern touchscreen smartphone, and gave developers another UI to build for: iPhone apps. In the wake of Apple’s success sprang a smartphone industry of copycats, and we all watched with keen interest the fragmented mobile OS wars. Eventually, two winners emerged from that war: Apple’s iOS and Google’s Android.

By this point UI development could be done relatively painlessly if you stuck to one platform (i.e., one OS and device form factor), but with the fragmentation of platforms, covering more than one meant re-writing your application. Writing once and deploying many times became desirable if you wanted an application with the broadest reach. Enter the RIA evolution (circa 2010) that led to cross-browser/cross-platform UI engines targeting desktop and mobile; there were three main contenders: Adobe Flash, Microsoft Silverlight, and Oracle JavaFX. However, Apple wanted a closed shop and was against supporting any cross-platform engine. Security had a part to play in that policy: Flash was continually being updated with security patches. Oracle, the new owner of Java, was internally divided on whether to grow JavaFX, and eventually decided not to, handing it over to an open source community where it continues to have a life today.

Circa 2016-2017, I had not appreciated how powerful Apple’s position was, as the mobile space still looked fragmented. After considering the options, I decided the next step for my chess variant was Adobe Flash. Many months passed before I acted, and I purchased a batch of books on programming Flash without checking on the state of the market. A few weeks later, Adobe announced it was abandoning Flash. The mobile wars were over, and Apple was a winner; it was too much for Adobe, hence its decision. Microsoft also abandoned its cross-platform engine. And back to the chess drawing board for me.

Today JavaScript is the dominant browser UI scripting language, with technology options such as Node.js and Angular, and others too many to mention. There is a separation today between programming core applications and programming the UI, as each has its own set of technologies. This fragmentation has also played nicely into the hands of the LCNC players. In the early days, these tools were known as model-based development and graphical UI (GUI) drag-and-drop solutions. LCNC may well be the answer for many application developers who are more interested in the core application and want the UI plumbing automated. It will release a pent-up demand to build applications which line-of-business departments desire and which many central IT departments often have no capacity to satisfy. LCNC today can play the role of a cross-platform UI builder, but solutions vary as to which platforms are supported.

Throughout this history of UI, while the technology has fragmented across multiple channels and become separate from core application development, the end user has benefitted. What has driven this UI history is the desire for better UX. UX is what the end user experiences when interacting with technology, and the ideal, of course, is that the technology just melts into the background so the end user does not notice it. UX has been the prime driver of change: the web browser gained adoption because it made navigating the internet easier, and the iPhone revolution was all about UX, expanding the mobile phone into a handheld computer running multiple apps in an easy, intuitive way. These technology waves made the UX better. At the same time, they created new barriers for any programmer wanting to create a cross-channel, UI-rich application. So it is no surprise to me to see the rise of LCNC: taking the burden out of cross-platform UI development is a great opportunity, and I think this sector of appdev will continue to grow.

Michael Azoff, Consulting Analyst

23 June 2021
