Programmer from hell plants logic bombs to guarantee future work

By Mark Stockley | Tue, 23 Jul 2019

If you’ve spent any time working with computer programmers then you’ve probably been part of a project that, for one reason or another, just seems to have too many bugs. No matter what you do, you can’t make progress: there are always more bugs, more rework and more bugs.

At some dark moment, as frustration at the lack of progress gnaws away at you, you may wonder: what if the programmers are adding the bugs deliberately?

If that’s occurred to you then you can bet that the programmers, who tend to be an intelligent bunch, have let their minds wander there too. Mine certainly has. Like me, they will have noticed that the incentives often stack up in favour of mischief: the work is often thankless, the code unsupervised and the money only good for the length of the project.

Thankfully, most of us are too morally upstanding to go there, but every barrel has its iffy apple.

In this story the barrel bears a Siemens logo and our apple is contractor David Tinley, who recently pleaded guilty to one count of intentional damage to a protected computer.

According to filings by the United States District Court for the Western District of Pennsylvania:

TINLEY, intentionally and without Siemens’ knowledge and authorization, inserted logic bombs into computer programs that he designed for Siemens. These logic bombs caused the programs to malfunction after the expiration of a certain date. As a result, Siemens was unaware of the cause of the malfunctions and required TINLEY to fix these malfunctions.

The logic bombs left by Tinley were bugs designed to cause problems in future, rather than at the time he added them. He might have done this to avoid looking like the cause of the kind of grinding, bug-riddled, non-progress I described at the beginning. Or perhaps he thought Siemens was less likely to give up on buggy code that’s been deployed than on code that’s still in development.

Law360 reports that he would fix the bugs by resetting the date the logic bombs were due to go off, and that his attorney argued he did this to guard his proprietary code rather than to make money.
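The court filings don’t include Tinley’s code, but the pattern they describe is easy to sketch. Here’s a minimal illustration in Python; the trigger date, function name and failure mode are all invented for the example:

```python
from datetime import date

# Hypothetical sketch of a date-triggered logic bomb of the kind
# described in the court filings. Everything here is invented.
TRIGGER_DATE = date(2016, 5, 1)  # the "fix" is moving this forward

def total_order(quantities, unit_price):
    if date.today() >= TRIGGER_DATE:
        # Past the trigger date, fail in a way that looks like an
        # ordinary bug rather than deliberate sabotage.
        raise ValueError("order calculation failed")
    return sum(quantities) * unit_price
```

A check like this is trivial to spot in a code review, which may explain why keeping other people out of the code mattered so much.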

It goes on to describe how Tinley was exposed after being forced to give others access to his code while he was on vacation. Siemens, it says, had to fix the buggy system without him in order to put a time-sensitive order through it.

According to court filings, Tinley worked as a contractor for Siemens for fourteen years, between 2002 and 2016, and engaged in his unorthodox income protection scheme for the last two.

He faces sentencing in November.

What to do?

If a contractor is refusing to let you see their code, or doesn’t trust you enough to give you access, that should raise a red flag for one of you. And if somebody is making themselves a single point of failure, you have a problem, even if they aren’t doing anything malicious.

In my experience, though, programmers are vastly more interested in fixing things than breaking them, and most projects have a plentiful enough supply of accidentally introduced faults.

That said, programmers and their code both get better with peer review, and modern development practices like continuous test and build cycles are designed to surface bad code as quickly as possible.

So, while I don’t think you should do either of those things just to root out bad apples, there are good reasons to do them anyway, and if you do you’ll stand a better chance of catching saboteurs.
