The Industrial Internet of Things, or IIoT, marks a phase shift in how physical production is managed. It’s sometimes referred to as the fourth industrial revolution, or Industry 4.0 (in nomenclature that clearly owes much to web terminology).
Software launching requires strategy, but strategy alone is not enough. There must also be a system to match up strategy with day-to-day activity, managing development and launch as a project. This must be sufficiently adaptable as to function as a constantly-revised, living document, not a series of instructions that cease to be relevant as products evolve and plans change.
Minimum Viable Product, or MVP, is a way of launching a product without having to build every detail. Instead, the smallest (minimum) version of the product that will still work (viable) is built and launched, to provide the basis for future iteration and development. Customer feedback, business metrics, and usage data are then used to refine, redesign, and extend the product; the MVP approach can be used for new features as well as new products.
The process of taking an idea for a new tool or feature, or a solution to an existing problem, and turning it into something people actually want to use, is not always simple or easy. It’s easy to make mistakes in planning. So we need prototyping to test ideas. The trouble with prototypes is that they involve a trade-off between how useful they are and how costly and time-consuming they are, which means that in the past they’d often be used to validate designs rather than invalidate and iterate on them.
End users don’t care much about the technologies we use or the methodologies we rely on for development. But they do care about their experiences. Everyone knows this, and when it’s time to market products and services, we remember. But when we’re designing solutions, whether they’re consumer-oriented or B2B, it’s often forgotten.
The Internet of Things (IoT) goes a long way beyond consumer-level technology like smart toasters and doorbells. It’s being adopted across manufacturing, warehousing, supply chains and logistics worldwide. Good results often follow: 80% of IoT projects achieve better-than-expected results, according to Gartner. The same survey found that the involvement of the CIO was crucial to the success of IoT projects, not surprisingly.
Some businesses are moving data from legacy systems to the cloud. Some are moving from on-site servers to other on-site servers, or between cloud implementations. It’s sometimes necessary to migrate data between applications, as business processes change and stacks change with them. One thing is certain: most businesses will do a data migration at some point. When they do, it will be fraught with risks — both during the migration itself, and afterward, when they try to run the business on the results.
Cloud computing is rapidly becoming ubiquitous. The change is driven by a set of compelling advantages, ranging from speed and security to access and user experience. Underlying these benefits are the facts of cloud computing: remote hosting and as-a-service implementations of everything from software to infrastructure mean you hire out tasks to experts, instead of trying to build an in-house team that can compete with Salesforce, Microsoft, and Amazon Web Services.
The hiring process for software engineers is complex and time-consuming. At the end of it, your company has a new hire with high salary expectations who might take several months to reach full productivity. You could spend many hours and thousands of dollars for each addition to your team, only to find they’re a bad fit — or even that they want to move roles again.
Roadmaps are crucial to the success of digital transformation projects. But they’re misunderstood, and often misapplied. In many cases, companies don’t have a roadmap at all! Where they do, it’s often not doing what it should for them.
Agile principles emphasize the use of relatively short development cycles, each of which achieves some tangible improvement to the software. Agile rapidly became the dominant approach to software development because it means you catch bugs early, before you’ve built too much on top of them, and because it gives you granular control over development direction. Yet, when it comes to UX, agile is underutilized. That’s because too many software development companies see UX as something separate from their main task.
Cloud migration involves moving data, applications and processes to the cloud. This can be from on-site implementations like owned servers, or from another cloud host. Moving to the cloud can be daunting, especially if you haven’t made the transition before. Even if you’re familiar with cloud computing, planning and executing a migration still requires forethought and strategic, as well as technical, consideration.
When should you outsource software and app design? For some companies, the answer is ‘never’ — but they’re pretty rare. Many companies rely to some extent on outsourcing, to get them over a hump, or to accelerate time to market without taking on the liabilities involved with a big full-time team.
Several technologies that are coming to the fore in commercial applications will change the way we develop software, and the uses to which it can be put. While some of these technologies are still at the unstable innovation stage, where it’s unclear whether they have a future or what it will involve, others are making the leap from niche uses and early adoption to the mainstream.
UX is a key consideration for any business. It’s literally how our users experience our products and services, so unsurprisingly it’s also what they tend to judge us on. It’s also the biggest single lever most companies can pull to affect revenue, profits and growth.
Digital technologies now increasingly pervade both business and everyday life. Employees and customers alike, whether business or consumers, draw their expectations from digital experiences. And techniques like big data and AI are obviously impossible without digitization.
When launching a new software product, the tendency can be to go for a ‘quick and dirty’ approach. And often, too much attention is paid to development and product design, too little to marketing. You get products that work really well, but don’t address key needs and requirements among the target market. Or you get products, sometimes high-quality ones, that launch in perfect silence and disappear the same way.
Over the past several years, I’ve tried to put my finger on why leading transformation feels a bit different than other leadership roles. In the area of transformation, leaders are often the primary change agent and don’t have others inside their organizations to turn to for advice.
When most people think of “computer software,” they think of applications that are used to interact with a PC, laptop, or mobile device. These programs display icons or information on a screen and take input from a keyboard, pointing device, touch screen, or game controller. But interactive software represents only a fraction of the software that’s out there.
Refrigeration fun fact: Electrically powered air conditioning, which relies on the principle of the refrigeration cycle, was invented not to make homes and other buildings comfortable in hot weather, but to control the temperature and humidity in a printing shop. The lithography process that was common in the early 1900s could be adversely affected by variations in temperature and humidity, so Willis Carrier invented a device to keep those factors stable.
Somewhat lost in our collective excitement about the end of the year 2020 – and many hoped, an end to all the unpleasantness associated with that year – was the fact that 31 December 2020 also marked the end of an important period in internet history: the Adobe Flash era.
No matter what your opinion might be of the pandemic or our collective response to it, one thing is clear: 2020 showed the business world how important it is to have a robust, agile digital transformation strategy. Companies with the flexibility to pivot – sometimes multiple times – with employees, customers, and production did better in 2020 than those with rigid systems and infrastructure.
In the classic Road Runner cartoons, you often see this when the Road Runner starts to run: His feet start moving, but the bird himself stays in place.
As if being ostracized by big-tech hosting and supporting services was not enough of a controversy for a large and rapidly growing social networking provider, allegations recently surfaced that the provider’s user data was leaked. Reports indicate that before Amazon removed the provider from its hosting service, an Austrian hacker claimed to have accessed users’ image and video files that were uploaded to the website, along with the associated metadata. Regardless of whether this information was obtained from publicly posted materials or from a hack, the issue reinforces the need to regard data security as a top application priority.
For most modern software applications, the developer doesn’t have to think much about hardware integration. Many hardware components the software comes in contact with—displays, pointing devices, keyboards, cameras, mobile device GPS receivers, etc.—have application programming interfaces (APIs) or are represented by standard interfaces built into the operating system.
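By way of illustration, here’s a minimal Python sketch of that idea, using the open-source OpenCV library (our choice for the example; the passage doesn’t name any particular interface). The operating system and driver stack handle the low-level camera integration, and the application simply asks for a frame:

```python
# Grab one frame from the default camera through a standard API.
# OpenCV (cv2) and the operating system handle the device details.
import cv2

camera = cv2.VideoCapture(0)   # open the default camera device
ok, frame = camera.read()      # read a single frame as a NumPy array
if ok:
    print(f"Captured a {frame.shape[1]}x{frame.shape[0]} frame")
else:
    print("No camera available")
camera.release()               # hand the device back to the OS
```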
Most computer mouse devices today are “optical”: they shine a light on the working surface (the desktop or a mouse pad) and use the reflected light to determine the speed and direction of the mouse’s motion. Earlier versions were mechanical, incorporating a trackball whose direction and speed of rotation were converted into electrical signals indicating the motion. (Mouse trackballs also had a habit of picking up dust and lint that interfered with the mouse’s operation. The optical mouse was a major improvement in this area.)
“We need to build a mobile app!”
“Sure, boss. What exactly should this mobile app do?”
“Doesn’t matter! All our competitors have mobile apps, so we need one too!”
This is exactly the wrong way to launch a mobile app project.
Yes, you probably do need a mobile app because an increasing number of people do a majority of their computer activities on mobile devices. And yes, most or all your competitors have mobile apps available on app stores. These are necessary but not sufficient conditions for building a mobile app.
One of the great innovations of the modern age—something even more useful than the beer-can hat—is mapping applications such as MapQuest and Google Maps. Not only can they show you the fastest way to get from A to B (accounting for traffic, construction, one-way streets, and other obstacles), but they can give you alternatives, such as those favoring or avoiding freeways or scenic routes. Some applications will even give you the best routes for traveling on foot, by bicycle, or public transportation.
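Production routing engines are proprietary and vastly more sophisticated, but the core idea is classic shortest-path search. Here’s a toy sketch of Dijkstra’s algorithm in Python over an invented road graph (edge weights represent travel minutes):

```python
import heapq

def shortest_route(graph, start, goal):
    """Return (total_minutes, path) for the cheapest route, or None."""
    queue = [(0, start, [start])]   # priority queue: (minutes, node, path)
    visited = set()
    while queue:
        minutes, node, path = heapq.heappop(queue)
        if node == goal:
            return minutes, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, cost in graph.get(node, {}).items():
            if neighbor not in visited:
                heapq.heappush(queue, (minutes + cost, neighbor, path + [neighbor]))
    return None

roads = {
    "A": {"B": 5, "C": 12},
    "B": {"C": 4, "D": 10},
    "C": {"D": 3},
}
print(shortest_route(roads, "A", "D"))   # (12, ['A', 'B', 'C', 'D'])
```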
Have you ever thought about what happens to a package when you turn it over to a shipping carrier for transport? Do you have a sneaking suspicion that they don’t really pay attention to markings such as “Fragile,” “Handle With Care,” and “This End Up”?
In most cases, you have no way of knowing how your packages are handled in transit. The fact that mishaps sometimes do occur is the reason why most carriers offer some form of insurance against loss or damage.
Choosing a vendor or service provider can be easy (if time-consuming) when you know what you’re looking for. If you need a contract metal fabricator to manufacture a part, you ask for work samples and perhaps ask about ISO 9000 certifications, turnaround times, or other pertinent information. For something like janitorial services, you might ask for a catalog of their services, prices, and references.
Don’t you love unexpected bonuses?
It doesn’t have to be anything big. The parking meter that still had an hour on it when you parked there. The pizza in the break room on a day you didn’t have time to pack a lunch. The dollar bill someone used as a bookmark in the library book you checked out. The canceled late-evening Zoom meeting that gave you time to read a story to your child.
Who would you rather have re-piping your home:
You probably don’t pay attention to every advertisement you see or hear. (There is so much advertising in modern, First-World life that if you did, you would never have time to do anything else.) But take a moment and consider some of the advertisements you come across.
Back in the 1960s, the term “hi-fi” (short for “high-fidelity”) was a popular buzzword. It started innocently enough as a term to describe the greater sound quality available from high-end stereophonic home audio equipment. But the term caught on in the marketing world and suddenly everything with a perceived elevated quality was known as “hi-fi.”
One of the words that came up frequently in my early digital career was “optimize”. We were always optimizing web content for keywords to improve search engine results and rankings. We optimized experiences to increase conversion of site visitors.
A few years ago, a story made the rounds confirming what many people suspected: The vast majority of thermostats on the walls in commercial buildings don’t do anything.
The future ain’t what it used to be. -Tom Petty, Spike
Consider two speculative sci-fi films, Blade Runner (1982) and 2001: A Space Odyssey (1968). Blade Runner was set in 2019 and 2001: A Space Odyssey was…2001. In both films, artificial intelligence (AI) plays a key role, and not in a good way.
An old story about corporate culture runs something like this:
A consultant gives a seminar about corporate culture and the benefits of having a positive, supportive culture that encourages collaboration, open communication, innovation, and other goodies. Afterwards, a CEO is overheard to say, in all earnestness, “I want one of those cultures, and I want it Monday morning!”
The Star Wars canon has medical droids playing a prominent role in autonomous diagnosis and treatment of injuries and illnesses. How cool would it be if even semi-autonomous systems could handle routine diagnoses and treatments, freeing doctors and caregivers to focus on more challenging cases? Well, that future is a little closer to reality than some may think!
Humankind, it seems, has for a long time been fascinated with the idea of machines that can mimic the human capabilities of thought, intuition, creativity, problem-solving, and communication. From the chess-playing Mechanical Turk of the 18th century (which was actually a clever and elaborate ruse) through the ELIZA computerized “psychotherapist” in the 1960s and any number of sci-fi franchises and Saturday-morning cartoons, history has no shortage of intelligent machines, real or imagined.
Humans have a curious habit of talking to inanimate objects. As children, we talk to stuffed animals and other toys. (Sometimes, we answer for them.) This carries into adulthood; we talk to plants, cars, televisions, kitchen appliances, computers, rocks... Sometimes we chastise them when they misbehave.
In the 1975 movie Monty Python and the Holy Grail, King Arthur and his knights concoct a brilliant plan for penetrating a heavily fortified castle: Drawing on the Trojan Horse story, they build a wooden rabbit (why not?) and position it outside the castle gates as a gift.
If someone asked you to list the top 10 technologically cutting-edge, forward-thinking industries aggressively adopting digital transformation, chances are “insurance” would not be among them.
We take software prototyping seriously. Prototyping—in particular, rapid prototyping—is the best way to make sure we, as AndPlus developers, understand what our customers expect the software to do and how it will perform in supporting their business processes.
Data has become the lifeblood of any modern enterprise. Making decisions on the basis of hunches and intuition can be valuable (assuming your hunches are correct most of the time). But to operate a medium- or large-size business, you need solid data and a way to extract actionable meaning from that data. Effective use of data is central to many companies’ digital transformation initiatives.
In the beginning, enterprise data lived on database storage servers and file servers in a company’s onsite data center. Although this arrangement gave company leadership peace of mind that their data resided safely within their metaphorical “four walls,” it was also a source of IT challenges:
Whenever any new product is designed, the designers should spend a good deal of time and effort on the product’s user interface. “Should” is the operative term here; sometimes it doesn’t happen adequately, if at all.
Some people have it easy.
At a cocktail party, when asked, “What line of work are you in?” they can answer in a couple of words without fear of getting quizzical looks or blank stares. “Accountant.” “Tax attorney.” “Kindergarten teacher.” “Truck driver.” “Software developer.”
Then there are user experience (UX) designers…
The past few years have brought pressure for top-line business growth along with expectations of bottom-line improvements. Given these pressures, many companies have lost their focus on employee training and development as a strategic component of their corporate success.
Information technology is one of those occupations no one notices until something goes wrong. Few users send appreciative notes to IT managers because the team is doing a great job of keeping the network running. Let the network go down for five minutes and then duck as the nastygrams are fired off.
Most large software projects come with interested parties or “stakeholders.” Each stakeholder wants something slightly different out of the project:
Just in case you’ve been hanging out in outer space for the last few years, IoT is the idea that all kinds of devices—from single-purpose temperature or pressure sensors in industrial machines and home automation devices to shipping containers, automobiles, and street lights—will be connected to each other and to the cloud in an ever-expanding network, both wired and wireless, providing data or carrying out useful tasks in isolation or in cooperation with other devices.
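To make that concrete, here’s a minimal sketch of the device side in Python, using the open-source paho-mqtt library (MQTT is one common IoT messaging protocol, though by no means the only option; the broker address and topic here are hypothetical):

```python
# A "thing" reporting one sensor reading to an MQTT broker
# (paho-mqtt 1.x style API; broker and topic are made up).
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.com", 1883)           # hypothetical broker
client.publish("factory/line1/temperature", "21.7")  # topic and payload
client.disconnect()
```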
Digital transformation isn’t easy.
The hardest part for most companies has less to do with the technology itself than with the organizational inertia that keeps them trapped in their current, antiquated processes. Getting stakeholders to examine and rethink those processes is a cultural change that takes time. Organizational change must be nurtured; it can’t be imposed.
One of the biggest mistakes business leaders make is assuming what brought them success in the past (business models, core competencies, business processes, and tools) will continue to do so indefinitely.
“Build or buy?”
It’s a common question in business:
Digital transformations are complicated endeavors for any company, but their purpose is simple: improve business performance.
An architect wouldn’t design a custom house for you without knowing something about your family, lifestyle, hobbies, and tastes. These inputs, and more, inform the requirements for the house: how many bedrooms, how many floors, what type of kitchen, and so on. From these requirements, the architect can design the size and arrangement of the rooms and the style of the exterior.
When you need a software solution for your business, you have a lot of options. Traditional choices, such as hiring full-time programmers to develop the tools your company needs, are being outpaced. Thanks to our new digital landscape, one of the most efficient methods used to develop software today is outsourcing.
“Digital transformation” isn’t just a buzzword anymore.
Some companies treat it as a buzzword and nothing more. They talk about digital transformation without having any real digital transformation initiatives. These are likely the same companies that talk about their “disruptive technologies” that aren’t disrupting anything.
But an increasing number of businesses are recognizing the value proposition of digital transformation. They’re putting meaningful resources into actual digital transformation projects.
Think about the products or services you use most. You tend to go to the same hairstylists, restaurants, clothing stores, construction contractors, cleaning services, and automobile dealerships over and over again.
Why do you return to them? Sometimes it’s simply a matter of habit. Sometimes it’s good customer service.
How do software development teams make sure the products they release are of high quality?
Not so long ago, a cynical answer to this question would have been something like, “Since when did software development teams care about quality?” It’s unfair, but it’s rooted in perceived quality issues from all types of development teams, from “lone wolf” contract developers to giant software conglomerates.
Software is complex stuff. Even relatively straightforward applications that do only a few things can have a dizzying number of possible “journeys” for users to take. Ideally, every one of those journeys is tested under all possible circumstances to ensure the software works as expected and doesn’t crash, pop up useless error messages, or provide wrong answers.
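Automated tests are how teams chip away at that combinatorial explosion. Here’s a trivially small sketch with Python’s pytest, where the function under test (and its expected behavior) is invented for the example:

```python
import pytest

def apply_discount(price, percent):
    """Hypothetical function under test."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_happy_path():
    assert apply_discount(100.0, 25) == 75.0

def test_bad_input_is_rejected_not_crashed():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```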
If you’ve been paying any attention at all to the world around you, you’ve probably noticed quite a bit of discussion, speculation, and hand-wringing about emerging technologies, such as robotics and artificial intelligence (AI). Much of what’s been written in the popular press and on social media centers around big, scary questions, such as:
76% of IT professionals state that it takes three months on average to develop an app, so it’s important to choose the right app developer. At AndPlus, we are focused on solving complex problems with software, and doing it fast. Clutch, a B2B ratings and reviews site, has released the 2019 Clutch Leader list. AndPlus is happy to announce that we have been named one of the top app developers for enterprises!
Clutch, a Washington, DC-based company, provides fair and transparent rankings of B2B companies. Their team of independent analysts conducts interviews with B2B clients and publishes reviews of past projects. These reviews form the basis for their ranking system and serve as a resource for potential customers. We are thankful to all of our clients who have left us reviews and allowed us to maintain a 4.9-star rating!
One of the biggest mistakes a business can make is to assume today’s success will be there tomorrow. A company can thrive for generations, relying on a tried-and-true formula for success. It’s easy to fall into the trap of believing that what has always worked in the past will always continue to work in the future.
All it takes is one disruptive technology, one upstart company that comes out of nowhere with a better, faster, cheaper, more convenient way to deliver the same products and services, to completely upend an established company’s entire business model.
No one can predict the future, so it’s inherently difficult to anticipate these developments. But there are things businesses can do to put themselves in a better position to respond to these developments when they occur. The most important of these is digital transformation.
AndPlus understands digital transformation as the process of organizational change brought about by the use of digital technologies and business models to improve performance. Under this definition, digital transformation must include the following:
It’s a tall order, and not easy to pull off. Many organizations treat it as no more than a buzzword: “All the cool companies say they’re pursuing digital transformations, so we’ll make the same claim.” Meanwhile, they stay light on the specifics of what’s actually being transformed.
Dig a little deeper and you’ll find those businesses that successfully execute one or more digital transformations are better able to attract and keep happy customers. Their organizations reduce inefficiencies, eliminate cumbersome manual processes, and lower costs while readying for important market changes.
A few years ago, journalist Sara Bongiorni wrote a book called A Year Without “Made in China”: One Family’s True Life Adventure in the Global Economy, about her family’s yearlong quest to boycott Chinese-made products. The author never quite articulates the fundamental reasons for her boycott, nor explores the macroeconomic reasons for China’s manufacturing juggernaut. But she does succeed in making the point that yes, it’s pretty difficult for a middle-class American family to avoid buying products that are made in China or that contain Chinese components or materials.
An old joke in hardware engineering circles says, contrary to popular belief, electronic gizmos don’t run on electricity. They run on smoke. When something goes wrong, the smoke (sometimes accompanied by flames) escapes, and the device stops working.
Buildings are getting smarter all the time.
No, the structures at your local office park won’t be winning spelling bees or appearing on Jeopardy! But they are becoming more automated, energy-efficient, and easier to manage and protect. As a result, buildings are becoming safer, more pleasant places to live and work.
In 2015, cybersecurity researchers Charlie Miller and Chris Valasek demonstrated that a Jeep Cherokee could be hacked and its critical systems commandeered over the internet. They were able to completely disable the vehicle in one scenario; later they showed how they could arbitrarily control the vehicle’s acceleration, steering, and braking. Chrysler recalled 1.4 million of the vehicles to patch the exposed vulnerabilities, at great expense (and embarrassment) to the company.
What’s a computer?
Ask anyone that question, and you’ll probably get variations of, “A machine with a screen and a keyboard and pointing device, used for running various software programs.” This has been the general “high-level” definition since the personal computer became popular in homes and businesses in the early 1980s.
Advertising. Almost every business depends on it, in some form, as a major (if not primary) marketing tool. And for some companies—in fact, some entire industries, such as broadcast media and print journalism—advertising is the main source of revenue. Google could hardly fund its wide-ranging initiatives and services without the money it rakes in from online advertising.
We’ve discussed cross-platform mobile app development quite a bit in this space in the last couple of years. It seems like every time we turn around, there’s another cleverly (if non-descriptively) named framework that claims to overcome the limitations of the others and promises ever-higher rates of code reuse across platforms.
We’ve discussed digital transformation—what it is, why it’s become so important, and how it benefits businesses that pursue it. By migrating business processes to digital platforms—and optimizing those processes, as part of the transformation—businesses realize greater efficiency, productivity, and cost savings.
If you’ve ever sat in the driver’s seat of an automobile, you’ve used the dashboard—or “instrument cluster,” to use the technical term. The dashboard puts all the information you need front and center, so you don’t have to take your eyes off the road to see what’s going on with the car.
Once upon a time, computer programming was done in “machine language”: the low-level instructions that told the computer’s processor exactly what to do. At a somewhat higher level was assembly language, which hid much of the processor-level detail but was still painstaking, easy to mess up, and hard to debug. Both levels were also specific to a type of processor; programming a computer with a different manufacturer’s processor meant learning a whole new language.
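You can still peek at that lower layer from a high-level language. By way of analogy, Python’s built-in dis module shows the bytecode (the instructions for Python’s virtual machine, rather than for a hardware processor) behind an ordinary function:

```python
import dis

def add(a, b):
    return a + b

dis.dis(add)
# Prints step-by-step instructions such as LOAD_FAST and RETURN_VALUE,
# plus BINARY_ADD (or BINARY_OP on newer Python versions): the kind of
# detail early programmers wrote out by hand.
```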
Among the more farfetched artifacts in the Star Trek universe is the medical tricorder. It’s a handheld device that can read a patient’s vital signs and even diagnose diseases when held near the patient. There’s no need for blood draws, biopsies, or waiting for lab results; just “wave the magic wand” and get all the medical information you need within seconds.
We’ve discussed recently the importance of firmware engineering, especially in light of the coming deluge of Internet of Things (IoT) devices.
Today we go into a bit more detail about firmware development: How we got where we are in the evolution of firmware development, some of the main differences between firmware development and PC or mobile software development, and how those differences drive the execution of a firmware project.
Cryptocurrencies, especially bitcoin, are making headlines these days. When speculators run up cryptocurrency values against the dollar, euro, and other government currencies, a speculative bubble results that can eventually burst. The resulting price crash causes significant financial losses to those who became players late in the game.
In the 1994 action film Speed, with Keanu Reeves and Sandra Bullock, a homicidal madman plants a bomb on a Los Angeles city bus and rigs it to explode if the vehicle’s speed falls below 50 miles per hour. (Spoiler alert: The good guys win and boy gets girl.)
What are the elements of a successful digital transformation?
That is, what are the qualities, principles, cultural characteristics, and more that an organization should have in place to increase its chances of success with a digital transformation initiative?
It’s more than digital tools or infrastructure, although these are important. In this article, we discuss three elements critical to any digital transformation.
In 1891, at the 10th International Medical Conference, Professor Themistocles Glück of Germany presented the results of an experiment in which he used ivory to replace the femoral head (the “ball” part of the ball-and-socket) of human patients with deteriorated hip joints.
In his 1991 song “Better Class of Losers,” Randy Travis sang that his kind of people were not those who “pay their bills on home computers.” At that time—before the World Wide Web—paying bills on home computers was revolutionary, available only on online services such as AOL and CompuServe.
In the 1890s, a German math teacher by the name of Wilhelm von Osten was convinced that his horse, named Hans, was capable of counting, addition, subtraction, square roots—in short, all manner of math problems. Von Osten would ask Hans a math question, and Hans would tap out the answer with one of his hooves. (Obviously, Hans was limited to positive integers.)
Here at AndPlus, we take a targeted approach to everything we do. Thanks to our fine-tuned, Agile-based development process, our clients tell us we consistently hit the bullseye. That’s why, even though we try to keep sharp objects out of our developers’ hands, we’re excited about a relatively new programming language called Dart.
In mathematics, a radical is the symbol used to indicate the square (or cube, or other) root operation. In chemistry, a radical (sometimes called a “free radical”) is an atom, ion, or molecule that has an unpaired valence electron, and thus is highly reactive. In politics, a radical is one who subscribes to extremist viewpoints. In youth culture, something radical is just far out, dude. Except they’re way too hip to go to the trouble of pronouncing all those syllables, so radical is just “rad.”
It’s been a while since we talked much about APIs in this space, so it’s probably time we revisited the topic. APIs (short for application programming interfaces) are becoming more important in software development all the time, so it’s a good idea for both techies and their customers to have a little refresher on what APIs are, how they work, and why they’re so important these days.
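As a warm-up, here’s what calling a typical web API looks like from Python with the requests library. The endpoint, parameters, and response fields below are hypothetical; real APIs document their own:

```python
import requests

response = requests.get(
    "https://api.example.com/v1/orders",      # hypothetical endpoint
    params={"status": "shipped", "limit": 10},
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    timeout=10,
)
response.raise_for_status()                   # fail loudly on HTTP errors
for order in response.json():                 # assumes a JSON list of orders
    print(order["id"], order["total"])
```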
By and large, computer programmers are, by both nature and necessity, a detail-oriented bunch. It takes someone who can get down into the weeds, the world of bits and bytes, of ones and zeroes, to craft an effective program to make a computer do something useful.
Recently, we talked in this space about user experience (UX) and why it’s so important to get it right when designing, building, and selling a software product. To recap, Google has come up with a handy way to evaluate and give a numerical value to a product’s UX, which helps guide designers and developers in the right direction to make improvements.
In 2016, Apple found itself engaged in a high-profile dispute with the U.S. Federal Bureau of Investigation (FBI) regarding the privacy and device encryption features of the company’s iPhone product line. The specific iPhone in question had been in the possession of Syed Rizwan Farook, who was suspected of conducting a terrorist attack in San Bernardino, California. The FBI recovered the phone after Farook was killed in a shootout with police.
Remember the “telephone” game? If not, it went something like this:
Kid 1 whispers a few words in Kid 2’s ear. Kid 2 then relays the message (again, by whispering) to Kid 3, and so on until the last kid receives the message and says it out loud. Usually, that message is not even close to the original, to the short-term amusement of everyone involved.
When you watch a rocket launch—whether it’s a high-profile NASA Mars mission or a commercial satellite launch by the likes of SpaceX—you’re seeing the culmination of months, sometimes years, of design, development, project management, planning, and execution. The bit where the rocket actually leaves the launch pad and goes into space should be the easy part: Just count down to zero, push a button, and watch it go, right?
There are two basic ways to think about new software versions:
Which one you adopt depends mainly on your attitude toward the software to begin with, and how much effort is required to implement it. The first reaction is reserved for software that you like using and for which you look forward to new features and benefits. Software with an easy update path (for example, one that doesn’t require uninstalling the previous version, doesn’t break existing files, and doesn’t require a reboot) also falls into this category. The second reaction is pretty much for all other software.
Today we start a series of blog posts that dive into the design and development process we use here at AndPlus.
Our philosophy at AndPlus follows the Agile development methodology. By way of review, Agile breaks down a development project into short (one- or two-week) mini development cycles called sprints. A fundamental principle of Agile is that at the end of each development sprint, the team should have a working (albeit not necessarily complete) version of the software product.
Humankind had been cooking rice for over 12,000 years before someone had the bright idea of automating the process with rice-cooking appliances. (Why we needed to automate a process that runs mostly without human intervention anyway is a debate for another time.) Although the electric rice steamer (some models of which could be considered “robots” in the broadest sense of the term) didn’t revolutionize the culinary arts the way, say, fire did, it’s just one small example of the surprising application of technology in an area one might not consider “high tech.”
Have you ever used software whose user interface had some or all of the following characteristics?
If so, you’re not alone—in fact, if you haven’t encountered such software, you are in a tiny minority.
Earlier in this space, we talked about the meaning of digital transformation, both for business in general and for AndPlus in particular. For today’s post, let’s dig a little deeper and explore what motivates businesses to seek such a thing.
In one of the many memorable scenes from Douglas Adams’s Hitchhiker’s Guide to the Galaxy series, heroes Arthur Dent and Ford Prefect find themselves on Earth, circa 2,000,000 BC, in the company of a large population of middle managers, hairdressers, management consultants, marketing people…in short, the “useless” third of their home planet who were sent to colonize Earth. In a committee meeting, they discuss the difficulty they’ve had in inventing the wheel. Ford Prefect, exasperated, exclaims that it’s the “single simplest machine in the Universe,” to which a marketing person replies, “All right Mr. Wiseguy, if you’re so clever, you tell us what color it should be.”
Once upon a time, photography was all about delayed gratification. You couldn’t see the results of your efforts until you had taken the film someplace to be developed and printed—a process that could take a day or more. And if you didn’t take many pictures, the roll of film might stay in your camera for months before you finished the roll and took it in for processing. If the photos were out of focus, too light, too dark, or poorly composed, you were out of luck.
As if there weren’t enough programming languages out there, along comes one that has gone from zero to one of the most popular languages in only a couple of years. The language is Swift, designed to succeed Objective-C in the world of app development for iOS devices and their numerous relatives.
The world, it seems, is getting dark.
It’s not for want of sunlight or electricity to power our myriad lighting devices. (Indeed, light pollution is considered by many to be a growing problem with environmental and public health implications.) It’s an aesthetic preference for darker colors in our daily lives. Our kitchen appliances have gone from white to black or unpainted stainless steel. Computer cases, for which “ivory” was once de rigueur, are now almost universally black or some shade of dark grey. Look around on our roads and you’ll mainly see cars that are black (even matte black), grey, or some muted shade of silver.
“My fellow Americans,” said every presidential candidate ever, “It’s time for a change.” Not every presidential candidate has said it in so many words, but when you boil down the rhetoric, that’s what it comes down to. The reality, of course, is things are going to change anyway, no matter who occupies the White House. The only thing about it that stays the same is that the occupant will always take credit when things change for the better and blame someone else when they don’t.
The world of business is rife with buzzwords. Some self-proclaimed business guru comes up with a clever term for a concept that forward-thinking businesses ought to adopt, and suddenly companies large and small start dropping that term into their mission statements, business plans, and marketing materials. Examples that have fallen into and out of vogue in recent years include “synergy,” “game changer,” “thought leader,” “move the needle,” and “right-size.”
Washington, D.C.-based GoodFirms has recognized AndPlus, placing it among the top mobile app development companies and top web development companies on its research and review platform.
“When you’re hot,” observed country singer Jerry Reed in the early ‘70s, “you’re hot.” An astute commentator on the human condition, Jerry also found the converse to be true: “When you’re not, you’re not.” On its face, it seems so obvious, right? But sometimes it takes a country song to set us straight on these things.
Few technological innovations have been both hyped and misunderstood as much as the internet of things (IoT). For many consumers, the scope of IoT begins and ends with smart-home systems that can monitor security cameras and control lights, locks, sprinklers, air conditioners, and other devices from the homeowner’s smartphone. Certainly, smart-home technology is an IoT application that is easily grasped by the average consumer. But it’s only one example of a technology that has wide-ranging applications and use cases, from agriculture and forestry to climatology, biology, zoology, and more.
When Amazon’s Echo product line first appeared in 2014, its user interface was all about—nay, only about—voice commands to and responses from the device’s natural-language processing personality (known in Amazon’s parlance as an “intelligent personal assistant”) called Alexa.
If you live, work, or just drive around in any large city, you know how frustrating it can be to find a place to park your vehicle. Paying for the privilege is a given; it’s just a question of how far away from your actual destination it will be and how much you will be charged.
From the “It’s Déjà Vu All Over Again” Department
Previously in this space, we discussed the relative merits of web applications in comparison with native apps for desktop platforms, such as Windows and Mac. Today we examine the question: Are there similar advantages and disadvantages with regard to native mobile apps?
The robots are coming! The robots are coming!
Okay, Paul Revere, settle down. Yes, robots of various kinds and with various capabilities are in development. Many are already available, in some form or fashion, and deployed in industries from manufacturing to hospitality and security. And there has been more than a little hand-wringing and scaremongering about how robots, and automation in general, will affect jobs, the economy, and the nature of business itself.
Consider the lowly umbrella: A mundane object, often cheaply made and inexpensively acquired, and with a singular habit of failing to do the job it was designed for. In anything but a light rain that falls straight down, an umbrella—even one of those big golf umbrellas—will keep very little of you dry. And if you’re sharing it with someone, forget it. As the Police sang many years ago, “It’s a big enough umbrella, but it’s always me that ends up getting wet.” It’s a wonder anyone uses the dadgum things at all.
It’s hard to believe, but Twitter is 12 years old this year. Remember when it was new? At the time, a whole lot of people wrote it off as a solution in search of a problem. Who, in their right mind, would want to participate in a service whose only function was to enable people to share their most mundane thoughts with each other, and with the world at large, in 140-character chunks?
The grand vision for human-computer interaction in recent years has been mobility: Users with lightweight, low-power “dumb terminals” communicating with cloud services via ubiquitous and speedy wireless connections to perform every computing task imaginable, from email and web surfing to more computationally intensive tasks such as video editing and big-data analytics. All of this, of course, would be courtesy of the cloud; there would no longer be any need, outside of perhaps gaming, for laptops and desktops with super-powerful, multicore processors.
Fun fact: “dogfood” (one word) has become a verb, at least in the business slang lexicon. “Dogfooding” is synonymous with “eating one’s own dog food,” which in turn refers to a business’s practice of using its own products—the same products it manufactures and sells to its customers—in the conduct of its business. This practice is generally considered a healthy sign for a business—how can you trust a business that uses its competitors’ products? It has a dark flip side, however: “not invented here,” the refusal to use someone else’s technology simply because your business didn’t come up with it, even if your own equivalent technology is inferior or nonexistent.
Computer scientists, by and large, are not considered particularly artistic. When you spend your time in the world of bits and bytes, algorithms and loops, and nodes and edges, you may not think much about aesthetics. To the extent that you do, you might think, “How can I get a computer to create an image or a song or a poem by itself?”
In one of the many memorable scenes in the 1987 film The Princess Bride, the disguised hero, Westley (played by Cary Elwes), rescues the captured princess (Robin Wright) from the evil Vizzini (Wallace Shawn) by challenging Vizzini to a battle of wits involving a bottle of wine, two goblets, and the deadly poison iocane. Westley takes the goblets, administers the poison out of Vizzini’s sight, and challenges Vizzini to drink from one. Vizzini spends several minutes overintellectualizing to decide which one is poisoned, and even switches the goblets while Westley is distracted. Finally, he chooses one and they both drink. Vizzini gloats over his superior intellect until he keels over dead.
As mentioned many times in this space, cloud-based services are becoming increasingly attractive to businesses of all sizes for all kinds of applications, from web servers and e-commerce to big data, machine learning, and the internet of things (IoT). With its convenience, security, flexibility, and low cost, cloud has many advantages over building, equipping, and staffing an on-premise data center.
As if we needed more evidence that machine learning is making its way out of the lab and into the hands of “regular” developers and their applications, along comes PyTorch, a Python open-source package developed at Facebook that enables neural network modeling, training, and testing, with a focus on deep learning and high performance.
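Here’s a minimal sketch of what that looks like in practice: a tiny network, one training step, done. The shapes and data are invented; real projects add datasets, batching, and validation:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(32, 4)        # 32 fake samples, 4 features each
y = torch.randn(32, 1)        # fake targets

prediction = model(x)         # forward pass
loss = loss_fn(prediction, y)
optimizer.zero_grad()
loss.backward()               # autograd computes gradients
optimizer.step()              # update the weights
print(f"loss after one step: {loss.item():.4f}")
```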
To the extent that they understood it at all, corporate executives have often regarded talk of deploying critical business applications and data in “the cloud” with suspicion: “How,” they asked, “do we guarantee security when our applications and data are in someone else’s data center, not ours?”
The term “disruptive,” when applied to business in general and technology in particular, has become something of a buzzword since its original coinage by Harvard Business School professor Clayton Christensen in the 1990s. Companies in all industries now claim to provide “disruptive” technologies or apply “disruptive” business models or processes.
It’s a strange irony: The more we try to make technology simpler, easier, more intuitive, and more convenient for end users, the more complex it becomes.
Consider the personal computer. The earliest PCs were simple by modern standards, with straightforward hardware architecture and minimally functional operating systems. But the user interfaces (C:\> prompt, anyone?) were opaque to anyone who wasn’t a computer engineer or hobbyist.
At the end of 2017, speculators had run the value of a single Bitcoin to over $18,000—a far cry from the pennies that Bitcoins were trading for just a few years ago. But then the Bitcoin price fell back, almost as fast as it had risen, and at this writing has been trading in the $5,000–$10,000 range for several months. The buzz about cryptocurrencies in general and Bitcoin in particular has faded in tandem with Bitcoin’s trading price.
From the “Be Careful What You Wish For” Department
Can it really have been only a year or so ago that commentators, both in this blog and the mobile development community at large, were complaining about how hard it was to write cross-platform mobile apps, and wouldn’t it be nice if there were some way—any way!—to generate fully native apps for each mobile operating system from a single code base?
You have to be a little bit crazy to be a C-level IT leader.
The CIO or CTO position is a thankless one at best. The only time you’re noticed is when things go wrong. And whether it’s infrastructure, security, or business systems, there are lots of things that can go wrong.
Ever since philosopher and mathematician René Descartes first set quill to paper to draw a line on his newfangled “Cartesian plane” in the 17th century, people have sought ever-cleverer ways to represent data in a pictorial format. The reasons are obvious: A graph, chart, gauge, or map can, at a glance, show important features and trends of a data set that you might miss by poring over tables of numbers. It’s the reason why Edward Tufte’s The Visual Display of Quantitative Information is still considered the bible of data visualization almost 40 years after its first release, and why firms large and small are demanding software “dashboards” showing the real-time health of their businesses.
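The barrier to entry has never been lower. A few lines of Python with the matplotlib library (one common choice among many; the numbers here are invented) turn a table of figures into a picture:

```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
revenue = [112, 118, 125, 121, 134, 142]   # hypothetical, in $1,000s

fig, ax = plt.subplots()
ax.plot(months, revenue, marker="o")
ax.set_title("Monthly Revenue")
ax.set_xlabel("Month")
ax.set_ylabel("Revenue ($K)")
plt.show()
```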
Do you suffer from any of these symptoms?
The news around machine learning (ML) just keeps getting better, as new and improved tools and techniques become available and more developers (not just computer science PhDs) can gain experience developing ML-based apps. The latest: Apple recently announced the release of the Create ML framework, a set of methods that developers can use to create and train ML models using Apple’s well-known Swift programming environment.
We talk a lot in this blog about programming frameworks and how they help developers do their jobs in various languages. It seems at times that for any given programming language there is an endless list of frameworks available. This is great for us, because it gives us an endless stream of material for the blog.
“Big data”—the gathering, manipulation, analysis, and reporting of data based on one or more data sets that are too large to be managed by traditional means—has had a big problem: Because of the vast quantity of data to be processed, a single computer, or even a high-end virtual or physical server with multiple CPU cores, is not up to the task of processing that much data efficiently. It’s much better to divide the work among several computers or servers operating in parallel.
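Here’s the divide-and-combine idea in miniature, on a single machine: Python’s multiprocessing splits a job across worker processes, and the partial results are merged at the end. Cluster frameworks such as Hadoop and Spark apply the same map-and-reduce pattern across many machines:

```python
from multiprocessing import Pool

def count_words(chunk):
    return len(chunk.split())          # "map" step: process one piece

if __name__ == "__main__":
    document = ("the quick brown fox " * 250_000).strip()
    words = document.split()
    # Split the work into chunks of 100,000 words each
    chunks = [" ".join(words[i:i + 100_000])
              for i in range(0, len(words), 100_000)]
    with Pool() as pool:
        partial_counts = pool.map(count_words, chunks)
    print(sum(partial_counts))         # "reduce" step: combine the pieces
```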
An old metaphor, intended to explain the concept of “infinity,” states that an infinite number of monkeys, banging away at an infinite number of keyboards, would write code just as well as we humans can, with better commenting.
For much of its history, artificial intelligence (AI) technologies of all kinds have been relegated to computer science laboratories and arcane academic papers. As discussed many times in this space, only recently has the technology, specifically machine learning techniques, advanced to a state where developers at large can experiment with it without requiring a PhD in computer science.
There’s been a good deal of talk, in this blog and elsewhere, about the brave new world of the internet of things (IoT) and how it will transform our personal and business lives. The talk has been accompanied by no small amount of hype, with pundits proclaiming that there will be anywhere from hundreds of millions to trillions(!) of devices connected to the internet in the near future.
Images in apps and web pages are a bit like electricity, or the internet itself: You don’t notice them until they aren’t there. And when they aren’t there, the experience can be unpleasant.
When an image loads slowly or not at all, it’s easy to blame the network connection or the size of the image. However, there’s actually much more to it than that. An app’s ability to load images quickly depends in large part on the efficiency of its image processing routines, which use complex algorithms to load images as fast as possible without degrading image quality.
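One everyday example of that work is serving appropriately sized images instead of full-resolution originals. A quick sketch with the Pillow library (the file names are hypothetical):

```python
from PIL import Image

with Image.open("photo_original.jpg") as img:
    img.thumbnail((640, 640))                 # downscale, preserving aspect ratio
    img.save("photo_small.jpg", quality=85)   # a smaller file loads much faster
```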
In the world of software development, enhancements in development tools and platforms tend to be incremental. Certainly, new tools, frameworks, and platforms that ease the job of software development or software project management come along with sometimes mind-spinning regularity, and we have discussed a good number of them in this space. But after that initial release, revolutionary enhancements of those tools in functionality, capability, and ease of use are pretty rare.
In enterprise computing, somewhere between the era of punched-card computers and the rise of the personal computer, there was the heyday of the mainframe and the “dumb terminal”—a keyboard and a monochrome monitor with no graphics capability, no mouse, no speakers, no webcam, no USB anything. One mainframe computer could support a large number of simultaneous users who logged in via these dumb terminals; they neither knew nor cared where the actual computer was located.
In his 2014 song “First World Problems,” “Weird Al” Yankovic sings about someone with issues—among them, “my house is so big, I can’t get Wi-Fi in the kitchen.” A first-world problem if there ever was one. We in developed countries take ubiquitous connectivity for granted, so it’s easy to forget that for over half of the world’s population, internet connectivity ranges from slow to nonexistent.
If you have only recently started hearing about Apple’s CarPlay and Google’s Android Auto, you may be surprised to learn that the technologies have been available since 2014—almost ancient history when it comes to mobile tech. It seems that the technologies have at last become available in enough new car models to make their way into the public consciousness.
A common theme in science fiction is that of robots who are, or somehow become, intelligent enough to have opinions on the way humans are running things; invariably, the opinion is that they don’t much care for it, and they decide as a group to take action in the form of the violent overthrow of their human masters.
By now, you’ve probably started reading and hearing about fifth-generation (5G) wireless networks, how they will enable lightning-fast download speeds and low latency, and how 5G is a disruptive technology that will change everything for everyone everywhere. Oh, and that every mobile carrier is the undisputed leader in 5G technology.
It's wonderful when startups succeed and burst into the limelight, but one of the sad facts of entrepreneurial life is that startup companies often fail. The biggest reason, according to some observers, is lack of a market for the product or service the company is building. But even with a compelling idea and a large, strong market, startups often fail to deliver a product that lives up to expectations—or, sometimes, any product at all—before the cash runs out and investors become disenchanted. Many great ideas have withered on the vine for want of a solid product launch.
As you’ve probably gathered by reading this blog, we’re really excited about the future of augmented reality (AR) technology. That’s especially true now that the two biggest mobile ecosystems, iOS and Android, have development kits (ARKit and ARCore, respectively) that enable developers to bring AR apps to the mass market, without having to fuss around learning the science behind AR.
The events of Sept. 11, 2001—as well as those of many large-scale disasters since then—highlighted the shortcomings of the communications systems used by first-responder emergency services agencies. Various police, fire, and emergency medical services personnel operated on different radio communication channels and thus could not share information with each other. Even within agencies, communication channels became overloaded with traffic. It was clear that the traditional network of dispatchers, command centers, vehicle radios, and walkie-talkies was not up to the task, and a better solution was needed.
Johannes Kepler (1571–1630) was, to say the least, a pretty smart guy... in fact, some say he was almost as smart as me. Without the aid of even a dollar-store calculator, he established the physical laws that describe the motion of planets through the heavens. His work predated and inspired Isaac Newton’s development of the theory of universal gravitation.
In ancient Roman mythology, Vulcan was the god of fire and metal smithery. It’s from his name that we get the word “volcano.” Much later, in Star Trek lore, Vulcan was the home planet of First Officer Spock of the starship Enterprise.
We talk a lot about virtual reality (VR) and augmented reality (AR) here at AndPlus, not only in this blog but also amongst ourselves and with our clients. These two related technologies are poised to spur some truly innovative, useful applications—and not just in the gaming and entertainment worlds.
Ask the average person on the street what IBM does, and you might get a blank stare, or perhaps “Didn’t they do that ‘Jeopardy!’ thing a few years ago?” Once a household name whose mainframes, PCs, and typewriters could be found in nearly every large company around the world, IBM has mostly fallen off the pop culture radar in the last few years.
You’d better sit down for this.
Google has officially ended support for Chrome Apps on Windows, Mac, and Linux versions of the Chrome browser. The Chrome App store is no more.
Shocking, isn’t it? Try to contain your disappointment.
If you’ve been in a bank lately, you’ve probably noticed there aren’t many tellers—perhaps two or three at any given time, tops. Many banks now encourage their customers to use ATMs instead for most of their banking needs. (Don’t weep for the bank tellers, though; because of the phenomenal growth in the number of bank branches, there are actually more bank tellers employed in the U.S. than ever before.)
We’ve talked several times in this space about cross-platform development, mainly with regard to mobile app development. The software project management, business strategy, and marketing advantages of being able to develop one code base and release the app for both iOS and Android at the same time are manifold: quicker time to market, better resource allocation, easier testing cycle, more consistent application look and feel, and more.
Congratulations! You’ve developed, designed, tested, and now launched your app, which excited users have been downloading since the release date. While this is cause for celebration, your work isn’t over yet (is it ever?).
Even though your app has launched, you still need to monitor its performance. Doing so on a consistent basis can help keep users from deleting it, to say nothing of bad-mouthing it on review sites.
Still, while you may be a pro in the design and development department, you may be a newbie when it comes to app monitoring—or could benefit from a crash course. Either way, read on to learn about app monitoring basics.
In case you haven’t noticed, there’s been a whole lot of progress in the last couple of years to make it easier for developers to deliver applications that work the same across platforms, across devices, and across browsers. The business of software development, it seems, is finally catching on to the fact that customers want the same experience regardless of hardware and platform choice, and developers don’t want to develop and maintain multiple flavors of the same app.
The results are in. For the second year running, the U.S. Chamber of Commerce Foundation has recognized the Boston metro area as tops in the nation for startups.
It wasn’t too long ago that developers like us had to build applications for Android and iOS in entirely different environments, with entirely different engineering teams. Frameworks such as PhoneGap, Xamarin, and React Native soon remedied this. While we've had success with all three in the past, it always felt like they were missing that special something that makes native applications feel... well... native.
You ever open up Instagram ready to consume some Grade-A content, but immediately start seeing posts from '5 Days Ago' at the top of your feed? That's because Instagram sucks at algorithms and wants to ruin your life.
You ever have a dream about a specific product, then about 20 minutes after your REM cycle is over, BOOM, it's on your Facebook feed? Yeah, that's also an algorithm.
You ever say something out loud, then it shows up as an ad everywhere you look? YEP, YOU GUESSED IT: MORE ALGORITHMS, BUDDY.
There’s been a good deal of chatter in recent years about algorithms. This previously esoteric concept has become a household term because algorithms of various sorts are pervading our lives, often in not-so-good ways. There are useful ones, though, that make our digital experience much better. Google’s algorithms that determine which sites show up on that all-important first page of search results are closely guarded secrets and the subject of much speculation among search engine optimization experts. Amazon and Netflix use algorithms to make recommendations based on your past viewing or buying selections. The list goes on... and on and on.
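If you’re curious what one of those looks like under the hood, here’s a minimal sketch in Python of the idea behind recommendation engines. To be clear, this is not Amazon’s or Netflix’s actual code; it’s a toy version of user-based collaborative filtering, with a made-up ratings table, that suggests titles liked by the users most similar to you:

    from math import sqrt

    # Toy ratings table: user -> {title: rating}. Real systems hold millions of these.
    ratings = {
        "alice": {"Stranger Things": 5, "The Crown": 3, "Black Mirror": 4},
        "bob":   {"Stranger Things": 4, "Black Mirror": 5, "Narcos": 4},
        "carol": {"The Crown": 5, "Narcos": 2},
    }

    def cosine_similarity(a, b):
        """Cosine similarity of two sparse rating vectors (missing titles count as 0)."""
        shared = set(a) & set(b)
        if not shared:
            return 0.0
        dot = sum(a[t] * b[t] for t in shared)
        return dot / (sqrt(sum(v * v for v in a.values())) *
                      sqrt(sum(v * v for v in b.values())))

    def recommend(user, ratings, top_n=3):
        """Score titles the user hasn't seen by similarity-weighted ratings of others."""
        scores = {}
        for other, theirs in ratings.items():
            if other == user:
                continue
            sim = cosine_similarity(ratings[user], theirs)
            for title, rating in theirs.items():
                if title not in ratings[user]:
                    scores[title] = scores.get(title, 0.0) + sim * rating
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

    print(recommend("alice", ratings))  # ['Narcos'] for this toy data

Real recommenders layer much more on top (implicit signals, freshness, business rules), but “find similar users, then weight their opinions” is the heart of the collaborative-filtering approach.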
Quick note: this is the final segment of this series. If you haven't seen all the posts, click here to start with the first one. There will be a round-up/reflection post up on MIT's blog soon. I'll keep ya updated.
At AndPlus, we’re pretty excited about recent advances in artificial intelligence, machine learning, and robotics. That’s hardly surprising, considering that we’re a pretty nerdy bunch that digs that sort of thing... and it's literally our job. Of course, as software engineers, we expect to be designing, developing, and using these technologies in new and creative ways.
Like many industries, healthcare is poised to be revolutionized by the Internet of Things (IoT). We already have the Fitbit and similar devices that incorporate various sensors and work with smartphone apps for fitness monitoring. However, there are many more ways that the medical industry can take advantage of IoT devices, and with good end-to-end development, we can expect to see some innovative systems on the market in the next few years.
Here at AndPlus, both in this blog and in our daily work, we talk a lot about robots. They may not be quite as awesome as they are in the movies... but we understand their importance in the future of hardware/software integration, a field we specialize in. We believe that the convergence of advanced hardware, standardized software platforms, and machine learning will bring about practical, intelligent robots that will help us with many, if not most, aspects of our lives—at home, on the road, in the office, on the factory floor, in the warehouse, and in many other settings.
Sup y'all, and Happy New Year! This is part 2 of a series of reflective posts. Missed the first part? No worries; click here.
I've rounded out 2017 by finishing my first course with MIT's Executive Education program, studying the Business Implications of Artificial Intelligence. I'm happy to continue my musings on what I've learned and the discussions that took place with my classmates. This second module focused on Natural Language Processing. What is that? What the heck does it do, and why does it matter? Well, let's talk...
Until recently, if you’d been asked to list the top augmented reality/virtual reality (AR/VR) development companies, your list would have included names such as Google and Oculus, but probably not Apple. With the recent release of ARKit in iOS 11, though, Apple is shrewdly making a bid to establish itself as a player in the AR space.
Suppose you run the shipping department for a company that ships products all over the world. You convinced upper management to buy a device for your department that weighs each package and detects its dimensions—width, length, and height—and shows the results on a display on the device’s front panel. All the shipping clerk has to do is read the information and enter it into the company’s shipping software, which calculates the rate and prints the shipping label.
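To make the arithmetic concrete, here’s a rough sketch of the rate calculation that shipping software typically performs once it has those four numbers. The divisor and per-pound rate below are hypothetical placeholders; real carriers publish their own dimensional-weight divisors and rate tables:

    # Hypothetical sketch of a shipping-rate calculation.
    # DIM_DIVISOR and RATE_PER_LB are made-up values for illustration.
    DIM_DIVISOR = 139.0   # cubic inches per pound (divisors vary by carrier)
    RATE_PER_LB = 1.25    # dollars per billable pound (real rates use lookup tables)

    def billable_weight(actual_lb, length_in, width_in, height_in):
        """Carriers typically charge on the greater of actual and dimensional weight."""
        dimensional_lb = (length_in * width_in * height_in) / DIM_DIVISOR
        return max(actual_lb, dimensional_lb)

    def shipping_cost(actual_lb, length_in, width_in, height_in):
        return RATE_PER_LB * billable_weight(actual_lb, length_in, width_in, height_in)

    # The four numbers the clerk reads off the device's front panel:
    print(f"${shipping_cost(8.0, 20, 14, 10):.2f}")  # dimensional weight wins here

The sore point in this scenario, of course, is that a human has to re-key numbers the device already knows; the natural next step is to have the device hand them to the software directly.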
Think, for a moment, about the dashboard in the vehicle you drive. Whether your vehicle is a 1973 AMC Gremlin, a yacht, or a Boeing 747, the characteristics are the same. They all provide:
We’ve all seen examples of bad design in our daily lives: appliances that break on first use, floor plans that are difficult to navigate, and of course, computer software that makes tasks harder, not easier.
Today, it seems that almost every software development organization employing more than one programmer subscribes to the Agile methodology. In fact, it’s difficult to find one that doesn’t. Given its pervasiveness now, it’s hard to remember that only a few years ago, Agile was a newfangled idea that only a few development shops were trying, while others dismissed it as a passing fad or were “waiting to see…”
Even before there were digital images and image editing software, graphic artists had an occasional need to extract an arbitrary shape (such as the outline of a human subject) from an image so that it could be placed in other images. In the days of chemical photography, this involved complex darkroom techniques or even painstakingly cutting things out of paper prints with scissors. Imagine fitting that into your schedule these days!
Science fiction is littered with sentient, omnipresent computers that respond to voice commands. From the fatally flawed HAL 9000 of 2001: A Space Odyssey to the cool, confident Starship Enterprise computer to the snarky, easily distracted Heart of Gold computer in The Hitchhiker’s Guide to the Galaxy, humans have imagined anthropomorphic computers that hear all, know all and control all.
As a society, we have grown accustomed to straight-line trajectories for technology: from concept to practical application, relentless performance improvement, and finally to maturity. The internal-combustion engine, personal computer, and smartphone have all followed this path, and have consistently lived up to whatever hype has been generated around them. It’s so common that we forget that not every technology follows this trajectory.
Companies often have excellent in-house staff and resources but lack the capacity or the specialized skills a particular project demands. AndPlus is adept at staff and skills augmentation, and we have the results to prove it.
Some things are more art than science. And while software development is definitely a science, testing it involves more than a little art. Dan Valderrama, QA Engineer at AndPlus, talks about the typical two-week sprint and how the company ensures a quality, shippable product at the end of it.
A long time ago, in a galaxy far, far away, the inhabitants had access to advanced medical care provided by robots. The Star Wars medical droids had the ability to diagnose and treat their patients with extraordinary knowledge and numerous built-in surgical tools. As an added bonus, they could reason and communicate with any patient of any species from any planet, speaking any language. Their bedside manner was, of course, impeccable.
“What!? Can’t you see I’m busy? None of the development team’s stories are getting done on time, and user features are still lagging behind.”
“I’m sorry, sir. I know you’re busy, but Mr. Petrov is on the phone. He says he may be able to help.”
“Petrov? Who?”
“Sergei Petrov, our contact in Moscow.”
“Oh, that's unexpected. What’s he want?”
“He said it’s for your ears only. And it’s terribly important. Something about the MoSCoW method.”
You’ve read a number of articles in this space about different types of machine learning, with a high-level view of how they work and the types of technologies that machine learning will enable in the future. “But,” you’re thinking, “What are some of the ways machine learning is being applied right now, to do useful work outside the laboratory?”