Thursday, March 26, 2026

Technology Should Recognize Human Frailty

My drone controller has two control sticks. They screw onto the controller near the top. Before I pack away the drone and controller, I must unscrew the control sticks, tuck them into a holding slot, and then slide the controller into my drone satchel. Why do they make me go through this hassle each time? Because if the satchel got bumped hard or squeezed between other luggage, those control sticks, jutting out perpendicular to the rest of the controller, would be a weak spot that could easily break. During normal use they would probably be fine, but removing them ensures no failures even with unexpected rough and tumble.

What are the chances I drop one of these small sticks while setting up? Maybe I am just clumsy, but in my experience so far, the chances are about 1 in 5. I am often out in a remote area to fly my drone, so when I get the controller out, I am mindful to remove and attach the control sticks while standing over a flat, wide surface. That way, if I accidentally drop one, it is unlikely to roll away or fall into a crack. Even when I do fumble one, this practice ensures the drop does not become a loss.

I see this as a case of the engineering maxim: “Hope for the best, but plan for the worst.”

Planning for the worst means anticipating abnormal and faulty behavior. Such planning is a hallmark of good engineering and is at the heart of safety-critical engineering. While it is relatively easy for me to anticipate the abnormal case of dropping a control stick, many of the systems we engineer are massively complex. Then it is not trivial to enumerate all the ways things can go sideways. As an example, let’s look at modern software.

Modern software is complex

To quantify how big modern software has gotten, I will use the traditional measure – in units of Apollos. That is, if the guidance software aboard the 1960s US Apollo spacecraft had around 30,000 Source Lines of Code (SLOC), then how many “Apollos” is a modern piece of software? The Operating System (OS) on your device (whether a smartphone or laptop) runs to tens of millions of lines of code, or over 500 Apollos.
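As a quick sanity check on the arithmetic, here is a tiny sketch. The 30,000-SLOC figure is the round number used above; the modern OS sizes fed in are illustrative:

```python
# Convert a codebase's size into "Apollos", using the ~30,000 SLOC
# round figure for the Apollo guidance software mentioned above.
APOLLO_SLOC = 30_000

def apollos(sloc: int) -> float:
    """Express a codebase's size in units of Apollos."""
    return sloc / APOLLO_SLOC

# A modern OS in the tens of millions of SLOC:
print(apollos(15_000_000))  # → 500.0 Apollos
print(apollos(40_000_000))  # roughly the Linux kernel's size today
```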

Consider the Linux kernel. It has surpassed 40 million SLOC. Where does all that code go? The Pareto Principle states that roughly 80% of consequences or effects result from only 20% of the causes, and the same skew shows up in many software programs. Only a small portion of that kernel code is exercised regularly. Much of it is rarely used, and on any specific machine, much of it may never be used.

Steven H. VanderLeest, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons

You can see in the Sankey diagram that the lion’s share of the Linux kernel goes to drivers. Many of these drivers are only needed on a select few target machines. (Kind of like in typical church congregations where 20% of the people do 80% of the work!)  
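That Pareto-style skew is easy to see with a few lines of code. This sketch uses made-up profiling numbers – the function names and call counts are hypothetical, chosen only to illustrate the 80/20 shape:

```python
# Hypothetical (function, times-called) profile of one run of a program.
# Names and counts are invented for illustration only.
profile = {
    "sched_tick": 120_000, "mem_alloc": 40_000,      # the hot core
    "usb_quirk": 60, "legacy_fs": 25, "exotic_nic": 10,
    "rare_ioctl": 5, "old_codec": 3, "vendor_hook": 2,
    "debug_dump": 0, "dead_path": 0,                 # never exercised here
}

total_calls = sum(profile.values())
hot = {name: n for name, n in profile.items() if n >= 1_000}

share_of_functions = len(hot) / len(profile)     # 2 of 10 → 20%
share_of_calls = sum(hot.values()) / total_calls # nearly all the activity
print(f"{share_of_functions:.0%} of the functions account for "
      f"{share_of_calls:.1%} of the calls")
```

With these invented numbers, 20% of the functions account for over 99% of the calls – the rest is the long tail of rarely exercised code.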

On the one hand, this uneven distribution means that the open source community can focus on ensuring the quality of that central core of the kernel. On the other hand, this means some bugs may lurk in the less used (and thus less examined) code. Whether they are software bugs or design miscalculations, subtle errors can be difficult to detect even with thorough testing, especially when the flaw only presents in unusual corner cases. The behavior of complex engineering systems can be very difficult to predict comprehensively. Even when the individual components are simple, the sum of the parts can exhibit unforeseen emergent behavior.

Beyond the Blueprint: The Secret Life of Complex Systems

Our designs often do not behave exactly as we intend. Abnormal behavior that we do not foresee can lead to unanticipated consequences that cause harm. This lack of foresight can be traced back to two fundamental characteristics of our human identity: we are finite and fallen.

Finite

Engineers who design technology (such as the Linux kernel) and people who use technology (all of us) are finite. Humans are finite – not as a result of the fall into sin – but from the very beginning. God created us; we are creatures. 

God intended the human characteristic of finitude. We are engineered with this inherent design characteristic. Although made in the image of God, we do not share God’s infinite qualities of omniscience and omnipotence. We thus do not have perfect foresight and cannot anticipate all consequences for our designs. Our brains have a limited capacity. 

Even when we augment our thinking by working with teams of naturally intelligent people using tools like artificial intelligence, our collective capacity for foresight is still finite. 

Because every creature, and indeed all of creation is finite, certain trade-offs were necessary even in the beginning, before the fall. Even in the garden of Eden, the design of a bridge would require choosing the right materials, trading off load capacity with total length, and so forth. That finite nature of our materials holds true today: trade-offs are a natural part of the design process.

Fallen

Both the engineers designing technology and the people using it are fallen. Humans are corrupted by sin, and all creation with us. Sin taints our thinking and our desires, and thus it taints our engineering design. Sin taints the thinking and desire of the users of technology.

God did not intend this human characteristic of fallenness. However, when the first humans sinned, God provided a new redemptive plan to address the flaw we introduced, even though the fall was never part of his design.

Until the final restoration, sin taints creation – though it is often hard to separate the characteristics of creation that are finite from those that are fallen. To act as Christ’s redemptive agents in this world today, we require the discernment that comes from the Holy Spirit so that we can identify the original creational good and work to root out the corruption of sin.

There are many virtues we could bring to bear in designing technology that properly recognizes both these characteristics. As an example, let’s consider the virtue of humility and see how it undergirds one particular avionics technology.

Case Study: Avionics Partitioning

A new avionics standard was published in late 1996. In that year I was an Assistant Professor of Engineering at Calvin College (now University). I wanted to keep my teaching fresh and relevant, so I pursued engineering consulting work during the summers. That particular summer I landed a part-time position at Smiths Aerospace in Grand Rapids. One of my tasks was evaluating the proposed international standard, ARINC 653. Little did I know then that I would continue working with this standard for much of my career. Thirty years later, just before my retirement, I have had the privilege to co-chair the ARINC 653 standards committee.

The ARINC 653 standard was first published in the fall of 1996, but my manager at Smiths had an early draft of the standard for consideration over the prior summer. I was part of a three-person team that he asked to evaluate the proposed standard, which would be used on a powerful new centralized avionics computing system. The new approach would consolidate numerous legacy federated computing systems into a single Integrated Modular Avionics (IMA) platform. Since each federated system was a separate box in the aircraft with its own power supply, enclosure, and connectors, this consolidation would substantially reduce the Size, Weight, and Power (SWaP) needed to execute all the software on a modern aircraft.

The substantial reductions in SWaP provided by IMA increase the range of the aircraft while reducing its cost. The drawback is that software previously segregated on physically separate computers is now integrated on one. This raises the possibility of unanticipated interactions between independent functions. The ARINC 653 standard, coupled with another standard, DO-297, describes how to continue segregating independent functionality using robust partitioning. 

System engineers map each independent software program to a partition. Each partition is allocated a certain portion of the resources provided by the computing platform, which are shared among multiple partitions. This sharing can be done in one of two ways: time partitioning or space partitioning.

Steven H. VanderLeest, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons

Time partitioning gives a partition exclusive access to a resource but only periodically. Think of a teacher who gets to use a classroom for a 9am class, but gives it up at 10am. Likewise, the partition gets to use a processor core for the first 10ms of every 100ms, but gives it up for the rest. 
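To make the classroom analogy concrete, here is a minimal sketch of a fixed time-partition schedule. The partition names, window sizes, and 100 ms major frame are hypothetical illustrations, not taken from the ARINC 653 standard itself:

```python
# A fixed, design-time schedule: a 100 ms "major frame" divided into
# exclusive windows. Each partition owns the core only in its window,
# every frame, whether or not it has work to do.
MAJOR_FRAME_MS = 100

# (partition, start offset in ms, duration in ms) -- hypothetical values
schedule = [
    ("flight_control", 0, 10),
    ("navigation", 10, 40),
    ("cabin_services", 50, 50),
]

def owner_at(t_ms: int) -> str:
    """Which partition owns the processor core at absolute time t_ms?"""
    offset = t_ms % MAJOR_FRAME_MS
    for name, start, duration in schedule:
        if start <= offset < start + duration:
            return name
    return "idle"

print(owner_at(5))    # flight_control: the first 10 ms of every frame
print(owner_at(105))  # flight_control again, in the next major frame
print(owner_at(75))   # cabin_services
```

Because the schedule is fixed at design time, no partition can steal another’s window – which is exactly the isolation guarantee time partitioning provides.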

Space partitioning gives the partition continuous (rather than scheduled) access to a resource – but not the whole resource, only a portion of it. Think of a student who gets one locker out of a bank of lockers, but has exclusive access to that one locker whenever they want. Likewise, the partition gets to use certain pages of memory that contain its data, which no other partition may read or modify.
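Space partitioning can be sketched the same way. In a real system the memory management unit (MMU) enforces this in hardware; the page-ownership table and partition names below are hypothetical:

```python
# Each partition owns a fixed set of memory pages; any access outside
# that set is refused. Ownership is fixed at design time.
PAGE_SIZE = 4096  # bytes, a typical MMU page size

# page number -> owning partition (hypothetical layout)
page_owner = {0: "flight_control", 1: "flight_control",
              2: "navigation", 3: "navigation", 4: "cabin_services"}

def check_access(partition: str, address: int) -> bool:
    """Allow the access only if the containing page belongs to partition."""
    return page_owner.get(address // PAGE_SIZE) == partition

print(check_access("navigation", 2 * PAGE_SIZE + 100))  # True: its own page
print(check_access("navigation", 0))  # False: flight_control's page
```

Like the locker in the analogy, the partition may use its own pages at any time, but never anyone else’s.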

Partitioning as Design Humility

By partitioning all shared computing resources in time and/or space, we can have high confidence that one partition cannot influence the intended behavior of another partition. Avoiding unintended interactions and reducing the chance of unanticipated changes in behavior is important for safety-critical systems like avionics.

Partitioning demonstrates humility in our design. As an engineer of any stripe, but especially a Christian engineer, I ought not put undue confidence in my own ability to comprehend complex systems and foresee all the ways things could turn out. 

Partitioning is a way to recognize my finite nature. By segregating each independent function, I can now simplify my analysis by restricting myself to thinking about that function alone – knowing that the other functions cannot impede or alter the function I am examining. I might not be able to wrap my finite mind around 50 complex software programs all combined into one monolith, but I have a fair chance at comprehending each one individually. Partitioning allows me to divide and conquer.

Partitioning is also a way to recognize the fallen nature of humanity. A negligent or malicious design of one function is now isolated in a partition so that it cannot harm the operation of other partitions.

Conclusion

If you are an engineer, whether designing software, automobiles, bridges, or some other technology, consider humility not only as a personal virtue, but one that can become part of your work as well. Humbly consider all the ways you yourself could be biased against seeing a flaw. Humbly consider all the negligent and malicious ways a user could abuse the product beyond its intended use. Designing technology in humility makes it safer for all. If our systems are too big for any one mind to hold, is it arrogant to build them without partitioning (or equivalent)?



Thursday, March 5, 2026

Welcome to the Fishbowl: Do we have the Right to Privacy?

A distraught woman wrote Dear Abby, worried that she had made some unflattering comments about her daughter-in-law to her son. 

Couple approaching a video doorbell
She got caught because the comments were recorded by their Ring doorbell, and the daughter-in-law heard the recording later. The famous advice columnist replied that the mother-in-law has “learned the hard way that in our technological society, privacy is history.”

We can no longer assume our conversations are private at someone’s doorstep. Before 2013 we assumed our phone calls were private. But then Edward Snowden debunked this misplaced trust. He leaked information about a government program to collect broad swaths of data regarding the phone calls of its own citizens. The existence of such programs had previously been denied by US intelligence officials. James Clapper, Director of National Intelligence, justified his original denial by explaining he was forced to use the “least untruthful” statement in order to keep the program secret. After the program was outed, some of the same officials told the American public not to worry – they weren’t actually listening in on our phone calls, merely recording the time and destination of each call. However, given that officials felt compelled to tell “untruths” about the programs in public testimony before Congress, it was hard to discern whether the later statements were true or, again, merely the “least untruthful.” Stories about our lack of privacy seem to come out weekly. We are being watched: by social media, by CCTV, by drones, by doorbells, and more.

Fish Bowl
Stories of close electronic scrutiny in our everyday lives remind me of “The Dead Past”, a science fiction short story by Isaac Asimov. The protagonist is a historian, desperately trying to gain access to a chronoscope (a sort of time machine that lets one view any location in the past) in order to study the history of ancient Carthage by direct observation. However, the instruments are controlled by a heavily bureaucratic government. After years of red tape and rejections, he builds his own chronoscope -- only to have it quickly confiscated by government agents. It turns out that the instruments have poor resolution and cannot look very far into the past. The government keeps the machines under lock and key because they realize the implication for privacy: since the past begins immediately after the present, the chronoscope can observe anyone’s “past” private behavior in nearly real time. The past was not so dead after all! The story ends with the inadvertent publication of simple instructions for building a chronoscope, and thus privacy is destroyed for all: “Happy goldfish bowl to you, to me, to everyone …”.


An NSA program to spy on the public is the first step to living in such a fishbowl. Face recognition technology and ubiquitous recording devices give many public and private institutions an extraordinary amount of intelligence about the ordinary citizen. When only privileged people in power have access to this intelligence, such power can easily be abused. It is thus worth examining more closely what precisely is the nature and origin of the so-called “right to privacy”. 


Cultural Origins of the Right to Privacy

The US Constitution does not have an explicit right to privacy. However, over the last century US courts have interpreted several clauses in the Bill of Rights to include privacy, particularly the 4th Amendment’s prohibition against unreasonable search and seizure and the 14th Amendment’s prohibition on limiting one’s liberty (extended to include privacy) without due process of law. Other nations have followed suit, giving limited privacy protections to citizens because such benefits have been collectively endorsed by society. For example, the Charter of Fundamental Rights of the European Union states that “Everyone has the right to respect for his or her private and family life, home and communications.” (Article 7) and “Everyone has the right to the protection of personal data concerning him or her.” (Article 8)

There are legitimate reasons to keep personal information confidential. Privacy helps prevent identity theft. Privacy prevents stigma because of medical conditions. Privacy protects intellectual property, such as a trade secret – the “secret sauce” ingredient in a company’s flagship product.

The secrecy of our data is valuable to us because of the potential harm that comes with its public release. It thus represents a kind of power. Identifying information enables us to conduct business and obtain services. We share certain information with selected organizations in order to confirm our identity. As long as only the two parties (you and the selected organization) know that information, it serves as your ID. 

However, once you or any of those organizations lose control of that information and it falls into the wrong hands, your ID is no longer secure and others can successfully impersonate you online. Thus a thief who steals your identity holds power over you. Likewise, an unscrupulous person who learns of your confidential medical condition could use the power of that information to blackmail you, shaking you down for cash or favors in order to keep the information from going public. Stealing intellectual property such as an invention idea is truly theft because it robs the owner of the full value of the idea.

While this section briefly outlined the social foundations of the right to privacy, it is also worthwhile for Christian readers to consider whether biblical foundations also support privacy.

Biblical Origins of the Right to Privacy

It turns out that scripture doesn’t have much to say about privacy. First, let’s look at a couple of passages that hint at privacy but really seem to be about something else.

We could perhaps infer such a right from the commandment against stealing, interpreting stealing to include the theft of someone’s intellectual property. However, that may be a stretch, since this commandment seems more about justice than privacy.

Another place where we might infer a right to privacy is the Sermon on the Mount. Jesus exhorts us to keep certain acts secret (out of the public eye), including our giving (Matthew 6:3) and our prayers (Matthew 6:6). However, in both these cases the purpose of privacy is not to keep confidential information – and the power it confers – out of someone else’s hands. Rather, the purpose of privacy here is to avoid prideful pretentiousness. Giving or praying publicly is done to impress people rather than God. Giving or praying in private is directed toward God instead of fellow humans.

In the same hilltop sermon, Jesus tells us to avoid judging others, lest we ourselves be judged (Matthew 7:1). His mandate recognizes that we only have a partial picture of our neighbors, and it is wrong for us to judge them without fully knowing their circumstances. Thus, there is an implied value for keeping information about others private and not gossiping about it. Paul repeats the call to avoid judging. “Therefore judge nothing before the appointed time; wait until the Lord comes. He will bring to light what is hidden in darkness and will expose the motives of the heart.” (1 Corinthians 4:5, NIV)

Albert Borgmann notes the connection between privacy and judgmentalism:  “...Thomas Huff has helpfully isolated the notion of privacy as freedom from intrusions that can lead to an unwarranted judgment on the person whose sphere of intimacy has been invaded. Of course, our next of kin, who are naturally members of our personal circle, and our friends, whom we have invited into it, are entitled to judge whatever we do. No one else may without our permission.” (Albert Borgmann, Power Failure: Christianity in the Culture of Technology, Grand Rapids: Brazos Press, 2003, p. 40.)  However, Borgmann then observes that we often use privacy to shield our consumerist behavior from the prying eyes of others. “What Huff calls the privacy norm is in large part the collective affirmation of consumption as an exercise of freedom that would be encumbered by judgmental intrusion.” (p. 43)  

Materialism is not the only bad behavior we attempt to keep secret. Most sins are private affairs that would shame us if made public:  adultery, domestic abuse, addictions, and the like. 

Privacy as Cover

Modern technology can afford us privacy in the form of anonymity on the web.  However, this privacy can be used to shroud illicit acts. The shroud can hide the sin or hide the sinner.

Hidden Sins of a Public Person

We are all public persons in one way or another. We may not be celebrities, but we are known, and thus “public” to our friends, family and colleagues. We value what others think of us, so we cultivate a certain public image. When we use privacy to hide shameful behavior that could tarnish our image if it became known, the technology of anonymity becomes an enabler of sin. 

The perception of electronic anonymity facilitates bad behavior on the web, such as online affairs or gambling. Ironically, people may turn to these vices in trying to find fulfillment. Yet the biblical book of wisdom tells us the opposite results:  "Whoever conceals their sins does not prosper, but the one who confesses and renounces them finds mercy.” (Proverbs 28:13, NIV)

Public Sins of a Hidden Person

Some sins are public by their nature. In these cases, anonymity shrouds the perpetrator rather than the sin. 

An example is cyberbullying or anonymous revenge porn, where a break-up leads an angry man to post risqué pictures of his ex-girlfriend that she shared with him when they still trusted each other. This sin (posting the pornographic pictures without permission) is perpetrated publicly, while the perpetrator often remains hidden.

Why is bullying wrong? “With the tongue we praise our Lord and Father, and with it we curse human beings, who have been made in God’s likeness. Out of the same mouth come praise and cursing. My brothers and sisters, this should not be.” (James 3:9-10) The apostle reminds us that the person we bully is made in God’s image. We must treat all humans with the respect due image-bearers.

Using Privacy with Care

Our legal right to privacy is not absolute -- one’s privacy can still be invaded if warranted, i.e., if due process is afforded to ensure the invasion is justified, in the judgment of a fair and unbiased court. This is important to prevent abuse of those rights. 

Likewise, any biblical basis for privacy is limited. It certainly does not extend to privacy used as a cloak for sin.

  • “...And know that your sin will find you out.” (Numbers 32:23)
  • “Nothing is hidden that will not be revealed, and nothing is secret that will not be made known. So then whatever you have said in the dark will be heard in the light, and what you have whispered in private rooms will be proclaimed from the housetops.” (Luke 12:2–3)

Accountability to others depends on their ability to regularly observe our behavior. However, privacy allows us to hide our behavior. While there might be legitimate reasons for keeping our communication and data out of the public eye, how do we avoid the temptation to use privacy to hide our bad behavior? Here’s a check: would you dare let a trustworthy friend review your past week’s email or web browsing history?

As engineers designing technology, are we making it too easy for people to live double lives?  Do we enable people to have a public face of righteousness with a technologically hidden face of wickedness?

Our Christian faith should make us cautious when exercising and enabling the privilege of privacy.  Privacy is too often merely a pretext to keep our sinful ways out of the light of day.  “It is shameful even to mention what the disobedient do in secret. But everything exposed by the light becomes visible.” (Ephesians 5:12-13, NIV)