Complex Information & Control Systems
Information technology is an integral part of the whole theory and science of complex systems. Because of their scale and number of components, it is essentially the only way we can access data about these systems, process it, and infer patterns within them. The same is true of complex engineered systems: although they are much more than just information technology, they are the product of it and virtually impossible without it. All of this technology has to be managed, operated, and maintained in some way, and information technology is today essentially the only method for managing the massive technological infrastructure of advanced economies. Just as researchers cannot study and interact with vast networks without computation, it is becoming increasingly unviable for us to interact with these complex engineered systems without being enabled by information systems. The two are critically intertwined.
The information revolution is in many ways the backbone of the current rapid proliferation and transformation of our technology landscape. As we have already noted, the information revolution is evolving as the focus moves up from the micro level of individual computerized devices to whole systems of people, technology, and information. We may have a good grasp of the individual computerized components and their internal workings, but we have very limited understanding of the systems that emerge out of the interaction between all these digital devices, people, and physical technologies. These emergent systems are what we call information systems.
When we scale the basic operations of computing that involve the storage, manipulation, and exchange of information up to the macro scale, datasets become big data: no longer a single file on a hard disk, but a cloud of data points streamed from a network of different sources. This data is often unstructured and noisy, such as the millions of images uploaded to the Internet every day, the hashtags on social media, or data from financial trading platforms. At this macro scale, computer programs become advanced analytics: a set of algorithms, including sophisticated statistical models, deep learning, and other advanced data mining techniques, designed to reveal patterns in large data sets. We can't go into the details of how these algorithms work here, but at a very high level they automate the process of turning data into information that can be acted upon.
These machine learning and deep learning algorithms are cutting-edge technologies that have only really come of age in recent years, with companies like Google scrambling to hire people with expertise in this area. We have never before had the capacity to automatically turn very large data sets into valuable information. This technology already provides solutions to many problems in image recognition, speech recognition, and natural language processing, and it will very likely be the engine behind many groundbreaking innovations in the coming years. This is the forefront of the information revolution today, and it remains a radically disruptive force.
The last technologies that we will briefly mention are social networking and mobile computing. Social networking gives people a presence in this world of information systems and makes our actions explicit. Computing is becoming increasingly pervasive as the digital and physical worlds converge. Social networking and gamification are coming out of their box and will become increasingly a part of, and overlaid on top of, the physical world, giving everything a social dimension. This again is very much cutting-edge technology, but there is growing research on the convergence of the Internet of Things and social networking, giving us what is called the social network of things. All of these different technologies combine to give us the acronym SMAC, standing for social, mobile, analytics, and cloud, and these technologies are currently at the forefront of shaping our I.T. infrastructure.
The formal definition of an information system is the combination of users, technology, and processes to achieve a given goal. Information systems collect, store, process, and exchange information. Today they are used in organizations of all kinds and sizes, from enterprise information systems to manufacturing, transportation, and urban information systems. Information systems serve a number of critical functions within complex engineered systems: basic control, automation, and coordination between different systems. We will discuss each of these functions separately.
A control system is a specialized subsystem for controlling another system. A very high-level generic model would consist of a sensor that receives information about the system being controlled and its environment, a logic unit that processes this information according to some set of instructions, and an actuator that executes the controller's instructions. In order to control a system, all of these elements need to be present and working together.
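This sensor, logic unit, actuator loop can be sketched in a few lines of code. The thermostat below is a hypothetical illustration of the generic model, assuming a simple on/off control rule; it is not a real device API.

```python
# Minimal sketch of the generic control loop: a sensor reads the
# controlled system, a logic unit decides, and an actuator acts.
# The heater/thermostat scenario is invented for illustration.

class Heater:
    """The system being controlled; the actuator switches it on or off."""
    def __init__(self):
        self.on = False

def read_sensor(room_temperature):
    # In a real controller this would sample hardware; here it
    # simply passes the simulated reading through.
    return room_temperature

def logic_unit(measured, setpoint=20.0):
    # The "set of instructions": simple on/off (bang-bang) control.
    return measured < setpoint

def actuate(heater, turn_on):
    heater.on = turn_on

heater = Heater()
for temp in [18.5, 19.2, 20.4, 21.0]:
    # Sensor -> logic -> actuator, repeated over time.
    actuate(heater, logic_unit(read_sensor(temp)))
```

Remove any one of the three elements and the loop fails: without the sensor there is nothing to decide on, without the logic unit nothing decides, and without the actuator the decision has no effect.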
How we control technology has, of course, evolved over time. With a hand tool like a shovel, all of the control functions are performed by the person operating the technology, who also inputs the physical energy into the system. Industrial technologies and new energy sources removed humans from direct physical control over the system, as control became mediated through mechanical levers. The electrical revolution gave us electronic interfaces, but these were still largely operated by a human. With the advent of information technology, basic control processes, such as those on production lines and in other industrial processes, have become automated.
There are only so many technologies that a single human can interact with and manage. At a relatively low level of technological saturation, such as in pre-modern societies, we can interact with and directly control all the technologies we own. But as the number of technologies and the complexity of the technological infrastructure increase, this is no longer possible. Developing the large infrastructure of the industrial age required a certain level of automation, allowing any single individual to be enabled by many more, and more diverse, technologies. It was no longer possible for us to manually interact with, directly control, or even understand all of these technologies.
The more technology we have and the more complex this infrastructure becomes, the more we need information in order to interact with it and manage it. The advent of digital computing and advanced telecommunications is driving a new level of automation, one required to manage the ever-growing complexity of the technological infrastructure that supports post-industrial societies. A single premium-class automobile may contain close to 100 million lines of software code executed on 70 to 100 electronic control units networked throughout the body of the car. Even the physical operations of whole mass transit rail systems, such as that of Dubai, have been automated.
Many other basic control processes have become automated, such as manufacturing processes in factories, switching in telephone networks, and the steering and stabilization of ships, aircraft, and other vehicles. General-purpose process control computers have increasingly replaced stand-alone controllers, with a single computer able to perform the operations of hundreds of controllers. A process control computer can process data from a network of PLCs, instruments, and controllers in order to control many individual variables. It can also analyze data, create real-time graphical displays for operators, and run reports for engineers and management. For example, the Union Pacific Railroad placed infrared thermometers, microphones, and ultrasound scanners alongside its tracks. These sensors scan every train as it passes and send readings to the railroad's data centers, where pattern-matching software identifies equipment at risk of failure. Increasingly, these systems will be connected to cloud platforms in order to run the kind of advanced analytics we discussed above, as major corporations such as General Electric and Cisco invest heavily in this technology.
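At its simplest, the kind of pattern matching in the trackside-scanner example amounts to flagging readings that deviate strongly from the fleet norm. The sketch below uses a basic z-score rule; the readings, threshold, and field of application are invented for illustration and are not Union Pacific's actual system.

```python
# Illustrative sketch of sensor-based failure screening, in the
# spirit of the trackside-scanner example. All numbers are invented.
from statistics import mean, stdev

def flag_at_risk(readings, z_threshold=2.0):
    """Return indices of readings that deviate strongly from the norm."""
    mu, sigma = mean(readings), stdev(readings)
    return [i for i, r in enumerate(readings)
            if sigma > 0 and abs(r - mu) / sigma > z_threshold]

# Simulated axle-bearing temperatures (degrees C) from one passing train.
temps = [41.2, 40.8, 42.1, 41.5, 40.9, 78.4, 41.7, 42.0]
print(flag_at_risk(temps))  # prints [5]: the 78.4 C bearing stands out
```

Real systems would use far richer models (acoustic signatures, trends over repeated scans), but the principle is the same: turn a stream of raw readings into a short, actionable list of equipment to inspect.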
Information systems also play an increasingly important role in coordination, load balancing, and optimization between disparate systems. With pervasive networking, we can sense our world like never before and get real data about how things are performing, and there is vast scope for optimization at both the micro and macro level. It is estimated that somewhere between 40 and 70 percent of electricity on the grid is wasted worldwide. The cost of traffic gridlock in Europe is estimated at a few percentage points of the entire GDP, and reports have estimated that over 30 percent of traffic in a city is caused by drivers searching for a parking spot. All of these systems could be greatly optimized through common protocols and platforms that enable information exchange and coordination.
As previously mentioned, the industrial model of organization that underpins the infrastructure we inherited was very much domain focused. We have departments for the domain of energy, departments for the domain of water, departments for transportation, and so on. What we don't have is departments for processes that cut across domains, and thus our infrastructure systems may be somewhat optimized in isolation, but they are certainly not optimized at the aggregate level. Getting these different systems to talk to each other is key to developing sustainable systems through energy efficiency and recycling. Smart cities are good examples of this, where different systems have to work together to make the whole system smart. For example, when an emergency is reported in the city of Barcelona, Spain, the approximate route of the emergency vehicle is entered into the traffic light system, setting all the lights to green as the vehicle approaches, through a mix of GPS and traffic management software, allowing emergency services to reach the incident without delay. This is the kind of complex cross-domain coordination that is required to make the whole system more efficient and smarter.
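The green-corridor example can be sketched as one system handing information to another: a dispatch system passes the vehicle's planned route to the traffic-light system, which overrides only the signals along that route. The class, method names, and intersection model below are hypothetical, not Barcelona's actual software.

```python
# Hypothetical sketch of cross-domain coordination: an emergency
# dispatch system hands a route to the traffic-light system,
# which clears a green corridor. All names are illustrative.

class TrafficLightSystem:
    def __init__(self, intersections):
        # Every signal starts on its normal automatic cycle.
        self.state = {i: "AUTO" for i in intersections}

    def clear_corridor(self, route):
        # Override only the signals on the emergency route.
        for intersection in route:
            if intersection in self.state:
                self.state[intersection] = "GREEN"

    def release_corridor(self, route):
        # Return the route's signals to their normal cycle.
        for intersection in route:
            if intersection in self.state:
                self.state[intersection] = "AUTO"

lights = TrafficLightSystem(["A1", "A2", "B1", "B2"])
lights.clear_corridor(["A1", "B1"])  # route supplied by dispatch/GPS
print(lights.state)  # A1 and B1 are now GREEN, the rest unchanged
```

The point is not the code itself but the interface: neither system needs to know the other's internals, only a shared protocol for exchanging a route.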
Because of the siloed nature of our industrial systems, they are not optimized for how end-users actually use them, that is, as part of a process. This is where social networking comes into the mix. By having a digital presence and making our activities explicit, we can begin to design systems that aggregate different services and coordinate them along the processes people are actually engaged in, and eventually do this in real time. Once a process is made explicit, different systems can be notified and begin to coordinate their activity to enable that process to take place in a seamless fashion. These types of adaptive, real-time processes require a very different architectural paradigm, called event-driven architecture.
Event-driven architecture
When supply chains or manufacturing processes become networked and can respond immediately to events occurring in other systems, and when prices on the electrical grid can adapt in real time to supply and demand, things become more contingent upon time and events play out through processes. This is not just about making things faster: when a system reaches a critical level of dynamism, the whole paradigm changes towards one that is process orientated, where systems are driven by event signals in time. The aim of analytics is then to find statistical patterns in these processes so that we are no longer reacting to things that have already occurred, but can be preemptive, preventing them from occurring in the first place.
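The core of the event-driven pattern can be sketched as a publish/subscribe bus: producers emit event signals, and any subscribed system reacts as events arrive rather than polling on a fixed schedule. The bus and the event names below are illustrative assumptions, not a specific product.

```python
# Minimal publish/subscribe sketch of event-driven architecture:
# systems subscribe to event types and are driven by signals as
# they occur. The grid-pricing scenario is invented for illustration.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        # Push the event to every subscriber the moment it occurs.
        for handler in self.subscribers[event_type]:
            handler(payload)

bus = EventBus()
log = []

# Two independent systems react to the same grid-price event.
bus.subscribe("grid.price_changed", lambda p: log.append(f"billing: {p}"))
bus.subscribe("grid.price_changed", lambda p: log.append(f"thermostat: {p}"))

bus.publish("grid.price_changed", 0.31)  # both subscribers fire
print(log)
```

Because the publisher knows nothing about its subscribers, new systems can be wired into the process without changing the systems that already emit the events, which is what makes the paradigm suit loosely coupled, real-time infrastructure.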
Lastly, before wrapping up, we will touch upon the subject of security, which is, of course, a major issue here. We should not be naïve about the scale of the risk involved as our critical infrastructure becomes automated, networked, and remotely controlled via common IP platforms. Today a typical car's airbags, steering, and brakes can all be hacked and controlled through the Internet for malicious ends. Control systems in nuclear power plants can be broken into, and with the rollout of IoT platforms, software will soon permeate all types of technologies as our critical infrastructure becomes increasingly dependent upon it. We can think about security with respect to control in terms of either access to control or the use of control.
Distributed systems like the Internet and IoT drive a new form of security. Traditional security is built around having something to secure, some well-defined information or system that typically belongs to an organization, and we can employ a professional I.T. security team to build a security wall around it because we know what is part of the system and what is not. But in the world of distributed systems like the Internet, we are dealing with billions of devices that may belong to end-users with little awareness of or concern for security, and these many exposed and vulnerable end-user devices can be harnessed for distributed attacks. In this way, security can become a tragedy of the commons. It may be of little consequence if I leave the default password on my router, but when millions of other people do likewise, the net result can be a macro-scale security issue, with many devices vulnerable to being harnessed for an attack. This is just one example of the nature of security within these distributed systems.
But as mentioned, security is about more than just preventing hackers from breaking into a system. It is ultimately about the appropriate use of control and power, and with this next generation of information systems we are consolidating and handing over an extraordinary amount of power to automated algorithms. A system is only really in control when awareness, responsibility, and power are all aligned. This means exercising control through a multi-tier framework, with more intelligent and aware systems guiding systems that have a lower capability for information and knowledge processing.
Whereas information and data may be growing at an exponential rate, this only makes intelligence an increasingly scarce resource. Information technology, on the one hand, commoditizes information and data, driving their value right down; but because of this, it also increases the value of knowledge and intelligence, making them scarce resources. Wherever there is demand for a scarce resource, there is a hierarchy based on access to that resource. This drives a new kind of hierarchical structure that is emerging out of the information revolution, captured in the acronym DIKW, which stands for data, information, knowledge, and wisdom. Controlling these systems in a long-term, sustainable, and secure way means understanding this hierarchy and building it into our systems of technology, so that this world of complex engineered systems that we are going into is governed and controlled by true knowledge and, ultimately, some form of wisdom.
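The DIKW ladder can be made concrete with a toy transformation: raw readings (data) are summarized into context (information), and an encoded rule (knowledge) turns that into action. The readings, threshold, and rule below are invented purely to illustrate the layers.

```python
# Illustrative sketch of the DIKW ladder: data -> information ->
# knowledge -> action. All numbers and rules are invented.

data = [2.1, 2.0, 2.3, 9.8, 2.2]   # raw vibration readings (data)

information = {                     # data summarized and contextualized
    "max": max(data),
    "mean": sum(data) / len(data),
}

def knowledge(info):
    # An encoded rule learned from experience: sustained peaks above
    # a threshold indicate a bearing that needs inspection.
    return "inspect bearing" if info["max"] > 5.0 else "no action"

print(knowledge(information))  # prints "inspect bearing"
```

Each layer is scarcer than the one below it: the readings are cheap and abundant, the summary is smaller, and the rule that turns it into a correct decision is the scarce, valuable part, which is the point of the hierarchy.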
In summary, then, the information systems that have developed over the past few decades are both a massive added source of complexity within our technologies and the solution to that complexity for end-users. These information systems enable us to harvest vast amounts of data, harnessing it to coordinate and optimize systems. Through the convergence and integration of a number of technologies, such as cloud computing, analytics, pervasive sensing, and social networking, we are reshaping our technology infrastructure to make it more adaptive, process orientated, dynamic, and real-time. This gives us the capacity to greatly increase the efficiency of our systems of technology through automation and real-time coordination within a new event-driven architecture. But it also brings many security concerns that require intelligent design and management to achieve long-term sustainable solutions.