Cross-platform software is the norm today. In many cases, developers don’t have to consider what operating system their apps will be deployed on, because apps tend to be more or less platform-agnostic.
So, should programmers still specialize in one OS or another? Or does the developer operating system no longer matter much from a programmer’s standpoint? The short answer is that, by and large, it won’t matter which OS developers use. There is one big exception, though, and that’s mobile development.
Operating systems for developers
If you compare traditional OSes, the differences shouldn’t be that significant for developers.
We deploy most apps in the cloud now, where you can choose to host them on whichever developer operating system you want — well, maybe not on macOS, but certainly Windows or Linux. And, even if you deploy your application locally, virtual machines (VMs) make it easy to set up whichever type of OS environment you need.
Cross-platform portability is an explicit goal for most popular programming languages today, such as C, Java and Python. C was born in the early 1970s as a way to make Unix portable across different hardware platforms. The Java virtual machine greatly simplified cross-OS portability. And Python applications can run on virtually any OS.
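To make the point concrete, here is a minimal Python sketch of platform-agnostic code. Nothing in it is tied to a particular OS: the standard library’s `pathlib` and `tempfile` modules abstract away details such as path separators and temp-directory locations, so the same script runs unchanged on Windows, macOS or Linux. (The `demo` directory and file names are arbitrary, chosen just for illustration.)

```python
from pathlib import Path
import tempfile

# Build a path without hardcoding "/" or "\" separators;
# pathlib selects the right separator for whatever OS runs the script.
work_dir = Path(tempfile.gettempdir()) / "demo"
work_dir.mkdir(exist_ok=True)

# Write and read a file using the same code on every platform.
report = work_dir / "report.txt"
report.write_text("generated on any OS\n")
print(report.read_text())
```

Run it on any of the three major desktop OSes and the behavior is identical, which is exactly the portability guarantee these languages aim for.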
Modern programming languages still aren’t entirely OS-agnostic, of course. Developers often have to address OS-specific dependencies when they write an application, and the installation process for most applications differs from one OS to the next.
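When those OS-specific dependencies do surface, the usual pattern is to detect the platform at runtime and branch. The sketch below shows one common case, choosing a per-user configuration directory, using the conventional locations on each OS (`%APPDATA%` on Windows, `~/Library/Application Support` on macOS, and the XDG config directory on Linux). The application name `myapp` is a hypothetical placeholder.

```python
import os
import platform
from pathlib import Path

def default_config_dir(app_name: str = "myapp") -> Path:
    """Return the conventional per-OS config directory for a hypothetical app."""
    system = platform.system()
    home = Path.home()
    if system == "Windows":
        # Windows convention: roaming application data.
        base = Path(os.environ.get("APPDATA", home / "AppData" / "Roaming"))
    elif system == "Darwin":  # macOS
        base = home / "Library" / "Application Support"
    else:
        # Linux and other Unix-likes: honor XDG_CONFIG_HOME, default ~/.config.
        base = Path(os.environ.get("XDG_CONFIG_HOME", home / ".config"))
    return base / app_name

print(default_config_dir())
```

Branches like this are exactly the residue of OS differences the surrounding text describes; libraries can hide them, but someone still has to write them.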
Still, by and large, the modern programmer doesn’t have to think about the differences between various developer operating systems nearly as much as they did a decade ago. In some cases, you can drag and drop the same application from one OS to another without requiring any configuration changes at all.
Programmers should no longer specialize in one platform or another. You don’t have to be a Linux programmer or a Windows programmer. You just have to be a programmer.
Mobile developer operating systems
All of the above is true if you work with traditional PC-based or web-based applications. If you program mobile apps, the developer operating system story can be quite different.
In a typical Android app development process, programmers have to:
- write their app in Java or Kotlin, the most common languages for Android;
- use an IDE optimized for Android, such as Android Studio, or a more general-purpose IDE, such as Eclipse;
- test their app on a wide variety of hardware and Android OS version combinations, because the Android universe is so large; and
- publish the app to the Play Store.
In the iOS development process, programmers need to:
- write their app in Swift or Objective-C;
- use Xcode as the IDE;
- test their app for a relatively small list of environments, given that the iOS ecosystem is not nearly as large and diverse as the Android one; and
- publish the app to Apple’s App Store and wait for approval.
In the mobile world, it’s much harder to be an all-purpose programmer. Most developers will find it easier to specialize in either Android or iOS.
The future promises more fragmentation
It’s worth noting that fragmentation within the developer community will increase as programming for IoT and other nontraditional platforms becomes more common. IoT is itself a highly fragmented and diverse ecosystem, and the skills required to write IoT applications can vary widely depending on which niche you’re writing for.
Programmers today live in a world of extremes. If you develop traditional PC or web apps, developer operating systems don’t matter much at all. You can focus on writing your app without much worry about the development and deployment nuances of Windows vs. Linux vs. whatever else.
When it comes to mobile app development — and, even more so, IoT programming — the differences between OS platforms are extreme, and programmers need to be prepared.