
Newsroom

2024/08/14 (updated 2024/09/24)

Anyone can build complicated. Our actions are determined by simplicity.

Keep it simple, stupid!

Advanced lightweight materials, such as these aluminum extrusions, are only part of the solution

Trucks are heavy. This is a simple statement of fact. They have been heavy for their entire history, and while there have been some modest steps towards weight-saving, these are largely token gestures which do not address the core of the weight problem.

But why is this a problem? We all have a mental model of a truck, and in that mental model everything is beefy and heavy; it obviously needs to be tough to handle the rigours of truck life. But is this actually true? The problem with heavy trucks is, to put it simply, that moving additional mass over a given distance at a given speed requires substantial additional energy. That energy must be stored somewhere, and it then takes further energy to move the mass of the energy store, and so on, in a vicious cycle¹. This builds significant inefficiency into the logistics sector, not just environmentally, in emissions, but also economically. When we pay for a truck to be on the road delivering goods, most of the energy cost we are paying for does not go into propelling the final product (the actual goods); instead it goes into moving, well, bits of truck.
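The scale of the problem can be sketched with some simple physics. Road-load energy from rolling resistance scales linearly with mass, so we can estimate what fraction of the energy bill actually moves the goods. All figures below (rolling-resistance coefficient, tare and payload masses) are illustrative assumptions, not measurements of any particular vehicle:

```python
# Rough illustration (assumed figures): energy spent moving payload vs. truck.
# Rolling-resistance energy over a distance d is roughly E = Crr * m * g * d,
# so it scales linearly with the mass being moved.
G = 9.81          # gravitational acceleration, m/s^2
CRR = 0.006       # assumed truck-tyre rolling-resistance coefficient
DIST = 100_000.0  # 100 km journey, in metres

def rolling_energy_mj(mass_kg: float) -> float:
    """Energy in MJ to overcome rolling resistance for a given mass."""
    return CRR * mass_kg * G * DIST / 1e6

truck_kg, payload_kg = 15_000.0, 10_000.0   # illustrative tare and payload
total = rolling_energy_mj(truck_kg + payload_kg)
payload_share = rolling_energy_mj(payload_kg) / total
print(f"{total:.0f} MJ total, {payload_share:.0%} of it moving the payload")
```

With these assumptions, only around 40% of the rolling-resistance energy moves the goods; the remainder moves the truck itself (and aerodynamic drag, ignored here, makes the picture no kinder).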

Thinking about it, this is a major issue, because we pay for it as a society. We pay economically, with our wallets: to move around a great deal of dead weight that is not actually useful, in the cost models of trucks, their bills of materials, and the capital expenditure to manufacture them, and in the wear and tear on our roads, which must bear the weight of heavy trucks. And we pay ecologically: in the additional emissions to move the weight around, in the additional emissions to manufacture all that material in the first place and to deliver it to the factory², in additional tyre and brake consumption, and in everything else that comes along with the weight. And we, as a society, have decided that this situation is acceptable, and have done so for decades!

In the context of anthropogenic climate change, and in the context of rising energy costs (and cost of living), this is no longer acceptable.

So what is the answer?

Famed automobile engineer and founder of Lotus Cars, Colin Chapman, understood precisely this problem in the 1960s. His famous axiom about building cars was:

Simplify, and add lightness.

While the essence is simple (use the fewest parts to achieve the best performance), there is a world of depth in these four words; let us unpick them.

Simplify: If a part is not there, it doesn’t take up: mass, volume, cost, materials, energy, overhead³.

Add lightness: Optimise for best performance: make each component as light as possible while meeting the safety, technical, and commercial (price) requirements. Generally, a component using less material will cost less than one using more.

This approach led to Chapman’s groundbreaking racecars (and to the founding of Lotus Cars as a way of financing them). The same logic, contemporary with Chapman, became the underlying philosophy adopted wholesale in the aerospace industry, whether for commercial aircraft, military warplanes, or rocketry, and it is now applied widely within the passenger automotive industry and even in passenger rail. So why hasn’t the truck industry adopted this axiom?

A World Championship winning car, the Lotus 25 was the first fully stressed monocoque chassis to appear in F1. An early brainchild of Chapman’s, it incorporated his philosophy of lightweight design and superb handling.

A diesel truck is largely insensitive to the requirement to store more energy, because of the enormous latent energy carried in diesel fuel (even allowing for the inefficiency of accessing that energy; most of the energy carried in fuels such as diesel is wasted⁴) and because there is a simple way to add more energy: fit a larger fuel tank.

However, an electric vehicle is hugely sensitive to energy efficiency, because batteries do not store nearly as much energy as liquid hydrocarbons like diesel or gasoline: the best lithium batteries in the world today still deliver only about 1/10 of the usable energy per unit mass found in the bonds of liquid hydrocarbons.
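The "about 1/10" figure deserves a quick sanity check. On raw energy per kilogram the gap is far wider, but a diesel engine wastes most of its fuel's energy while an electric drivetrain wastes comparatively little, so the usable gap at the wheel narrows to roughly the order the text cites. A back-of-envelope sketch, where the cell energy density and both efficiencies are assumed round numbers rather than quoted specifications:

```python
# Back-of-envelope check on the "about 1/10" figure (assumed values).
DIESEL_MJ_PER_KG = 45.6       # lower heating value of diesel fuel
CELL_WH_PER_KG = 300.0        # assumed state-of-the-art lithium cell
ICE_EFF, EV_EFF = 0.20, 0.90  # assumed tank-to-wheel efficiencies

diesel_wh_per_kg = DIESEL_MJ_PER_KG / 3.6e-3   # MJ -> Wh (1 Wh = 3.6 kJ)
raw_ratio = CELL_WH_PER_KG / diesel_wh_per_kg
useful_ratio = (CELL_WH_PER_KG * EV_EFF) / (diesel_wh_per_kg * ICE_EFF)
print(f"raw: 1/{1/raw_ratio:.0f}, usable at the wheel: 1/{1/useful_ratio:.0f}")
```

Raw storage is closer to 1/40, but once the combustion engine's waste heat is accounted for, usable energy per kilogram lands near the 1/10 order of magnitude.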

How do we improve systemic energy efficiency in an electric vehicle, without resorting to complex and complicated Rube Goldbergian solutions? The simplest answer is to save weight, eliminating unnecessary material through sophisticated, smart design.

This is a difficult thing to do, but it is possible, and it unlocks a whole host of advantages.

There are a number of factors that result in heavy trucks, but fundamentally, the existing paradigm dictates the requirements for a comparatively heavy powertrain and chassis (with respect to usable payload weight).

Conventional powertrains, whether diesel or electric (as described in one of my earlier pieces), transmit power mechanically, like a Victorian mill, using shafts and linkages. These have to be very heavy to transmit the high levels of power needed to propel the weight of the truck plus the weight of the payload. This power is rotational (that is to say, torque), and so there is an awful lot of rotating mass, the goal being to eventually turn the road wheels.

This imparts significant twisting forces — torsional loads — to the chassis, resulting in a beefed-up chassis to counteract these massive forces (the chassis needs to be strong enough to withstand these forces without twisting itself up).

Animated GIF image demonstrating the relationship between force (F), torque (τ), linear momentum (p), angular momentum (L), and position (r) of rotating particle. Internal Combustion Engines produce useful torque only over a limited range of rotational speeds (RPM) while electric motors tend to produce maximum torque close to zero RPM.

With a heavy chassis and heavy powertrain as the foundation, there is no way to adapt a conventional truck design to be lighter, beyond very marginal gains. Because the chassis is flexing all the time, it dictates a separate cab and a separate payload box, each of which must be a self-supporting structure built on top of other structures, and so on. Finally, all of these components must be fastened together, usually with nuts and bolts, and so there is the weight of the fasteners, the cost of manufacturing, purchasing, and storing them, and the labour of installing them (and all the failure points they introduce!). All of these inherent inefficiencies are, as I wrote earlier, paid for by society.

The most pertinent observation here is that neither cars, nor planes, nor passenger trains are bolted together anymore — this was one of the first things that the aerospace industry did away with, all with a view to weight-saving to improve performance.

At Bristol Superlight, we brought an aerospace-type design philosophy to the commercial vehicle industry. This is apparent not just in our approach to vehicle controls, but also in our physical architecture, including the materials we use and the technologies and techniques we use to join those materials together.

The very first thing we did was simplify — we eliminated the entire mechanical powertrain. We replaced it with electronics, which do not weigh much, and with software, which weighs nothing. We did this to liberate huge amounts of efficiency, since we no longer had to deal with mechanical transmission losses. This further allowed us to eliminate a whole bunch of material that was no longer necessary to counteract the twisting forces that are inherent to a mechanical powertrain layout.

The advantage of not installing the material in the first place, is that the material does not need to be:

  • Manufactured, with the economic and ecological cost of manufacture
  • Purchased, with the attendant time and administrative expense
  • Transported to its place of installation
  • Handled and stored
  • Installed
  • Carried around for the service life of the vehicle, with the attendant energy, economic, and maintenance and repair costs
  • And if it is not there — it cannot go wrong or fail!

The next step was to add lightness. This entailed using the material we would carry as intelligently as possible — and ideally to use that material for more than one purpose, at the same time. This is most evident in the structures we can design. Lightweight structures can carry significant payloads and can withstand enormous forces (such as in a crash). Cars, aircraft, and even commuter trains have undergone a transition to lightweight multipurpose structures known as monocoques.

Source: Barnes Wallis Foundation. The spaceframe of a Wellington Bomber in 1939. Spaceframe structures were originally developed in the aerospace industry as a way of combining strength and lightness. A spaceframe architecture is integral: it carries its own weight and provides its own strength. It is a lighter, stronger structural architecture than body-on-frame.

Monocoques are self-integral structures which fulfil multiple functions. In road-going vehicles they carry the weight of the payload (including overload conditions), but they also withstand potholes and kerb strikes, and they are designed to protect drivers, occupants, and pedestrians through what are known as “crash structures”. The use of multipurpose structures, as opposed to single-function structures (which, as the name suggests, each perform a single function: “carry the payload”, “house the driver”, and so on), drives down weight and improves system efficiency (and, in the case of a truck, increases the available payload capacity, improving the ratio between “truck” and “payload”). Using less material also drives down the cost per vehicle, as well as the capital cost required to manufacture it. There is a reason the entire passenger automotive industry has transitioned to multipurpose monocoque structures!

It is clear that lightweighting is a virtuous circle, with many advantages. So why is everything on a truck so darn heavy? It does not need to be this way, and we have proved it.

¹As I mentioned in Lost in Transmission, Tsiolkovsky’s Rocket Equation (or Tyranny of the Rocket Equation) dictates that the more fuel you have, the more fuel you need (i.e., fuel needs to be added to propel the fuel that is added to propel…) ^

² On more heavy trucks! ^

³ Elon Musk (in the context of SpaceX rocketry) recently restated this as, “the best part is no part”. ^

⁴ Upwards of 80% of the energy stored in liquid fuel such as diesel is wasted in a commercial vehicle on, to misquote Shakespeare, “sound and fury” — i.e., as heat (engine gets hot) and sound (engine is noisy). ^

⁵ As Korolev implies, complexity is not the same as sophistication. ^

⁶ Amusingly enough, the entire way trucks have been built for the last century is the very antithesis of Korolev’s or Kelly’s observations: liquid fuel is squirted into chambers where, under compression, it self-ignites, the force of the explosion driving a very complicated sequence of parts to convert reciprocating motion into rotational motion, which is then transmitted through a series of gears, shafts, and linkages, to the point where it is delivered perpendicularly to the road by a complex mechanism (the differential) whose origins stretch back centuries. Phew! ^



2023/07/06 (updated 2023/07/06)

Give me a long enough lever and a fulcrum on which to place it, and I shall move the world.

  • Archimedes

The Apollo program’s Lunar Module was the world’s first digital vehicle. When Neil Armstrong “took control” to land on the Moon, his control inputs went into the incredibly sophisticated Apollo Guidance Computer, which interpreted those inputs, and then controlled parameters such as yaw, pitch, and roll, which allowed Neil (and those who followed him) to safely descend to the Lunar surface.

The AGC was a marvel of engineering, a miniature digital computer that was a key enabler for the entire rest of the program. For all the Saturn V rocket, the infrastructure, and the Apollo spacecraft, without the AGC the landings would never have happened.

Photograph of the dual NOR gate chip used to build the Block II Apollo Guidance Computer (AGC). Source: Grabert at German Wikipedia, public domain, via Wikimedia Commons

One of the several reasons why the Soviet Union never landed humans on the Moon was the lack of sophistication of their system controls (and the consequent total lack of software¹). The Soviet equivalent of the AGC was the Globus ИНК (INK), a terrifically complex² analogue (mechanical) computer with gears, differentials, and cams, all intermeshing to provide a navigational output which a cosmonaut could use to work out where they were, and where they would be likely to land. The Globus was fundamentally inflexible in a way that the AGC was not. The Globus could only predict a spacecraft’s position based on predetermined inputs (unlike the AGC, it could not take inputs from sensors to determine a true position), and it could only perform the function it was pre-set to perform³. This meant that many of the feats which permitted the Apollo program’s success⁴ would have been impossible to accomplish.

The Apollo Guidance Computer was introduced in 1966, at the core of the Lunar Module design — a design paradigm that was digital-first, and unashamedly so, in the sense that the design would not have been possible without the AGC. And while more or less every road-going vehicle today features embedded computers, all too often these have been introduced as a means of replacing a mechanical component or system with something of equivalent functionality for reasons of either cost reduction or sometimes improvements in efficiency. As a result, the true advantages of a digital-first design cannot be taken advantage of. These vehicles are stuck using the equivalent of a Globus, when what is needed is an Apollo Guidance Computer.

The DSKY input module (right) shown alongside the Apollo Guidance Computer’s main casing (left). Source: Grabert at German Wikipedia, public domain, via Wikimedia Commons

For instance, mechanical carburettors were replaced with mechanical fuel injection, and then with electronic fuel injection, to improve reliability, reduce cost, and enhance efficiency. All of this was realised, but it was still fundamentally built on a technology stack stretching back to the dawn of internal combustion: replicating an existing function, just better.

Silicon, and computing power, can be “layered on” to improve what is already there, and modern cars and trucks have computing power that would make the Apollo program blush. But as we grapple with the requirement to shift to more efficient, electrically driven road vehicles, optimising this energy shift requires a clean-sheet design with completely integrated, flexible, and connected core computing and software, allowing leaps beyond what was possible with a mechanical system. This is a totally different paradigm, requiring different design principles.

A key principle here is that silicon is cheap, and with the steady advancement of Moore’s Law⁵ it is now almost laughable not to include embedded silicon wherever possible. The cost per transistor, even with recent supply chain issues, is so low as to be almost inconsequential.

But silicon in itself is only really the fulcrum of the lever that is software control⁶.

And if all you’re doing is replacing mechanical components with electrical and electronic ones, then you’ll never be able to provide the meaningful movement of a long lever that is going to be necessary to grapple with the transition to a low-carbon economy.

And this speaks to the underlying paradigm shift required to accomplish this. There are a lot of industries still grappling with the transition to digital, and it requires a fundamental shift to a software-first approach and a software-first competency. If the mindset is still where electronics and software is secondary to the mechanical paradigm, where digital is regarded as a “bolt-on” or replacement, then this will never take advantage of the opportunities afforded by electric.

Ultimately, electrification isn’t merely about swapping a combustion engine out with an electric motor — this is the relatively easy part that both existing players and startups are doing. This approach will never be able to enable the benefits of electrification, except at a very superficial level. Switching to electric requires a wholesale change in everything, from drivetrain architectures to data networks to reshaping — in our case — what a truck really is. This is a fundamental change not only in technology, but also in core competencies, culture, and design principles.

The ascent stage of Apollo 17’s Lunar Module (LM). Source: NASA

To illustrate with another example from the aerospace world: the F-16 was the world’s first fly-by-wire, digital proportional-control production aircraft. It differed not just in how inputs to the control stick caused the flight control surfaces to move, but in the fundamental design of the aircraft as inherently unstable. Like the Apollo LM, the computer flies the plane: the pilot inputs what they want to happen, and the computers figure out how to make it happen. Today it would be unthinkable to design a modern aircraft any other way. This is a design paradigm that, once the change took hold, propagated and drove all other paradigms to extinction.
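The reason an inherently unstable airframe demands a computer in the loop can be shown with a toy model (this is an illustration, not flight code, and the plant gain and feedback gain below are arbitrary assumed numbers): an unstable system amplifies any disturbance each cycle, but a simple feedback law applied fast enough tames it.

```python
# Toy illustration of why an unstable plant needs a computer in the loop.
# Open loop, a disturbance grows every step; closed loop, a simple
# proportional correction applied each control cycle drives it to zero.
A = 1.1   # assumed plant gain: >1 means instability (disturbance grows)
K = 0.3   # assumed feedback gain, chosen so |A - K| < 1 (stable)

def simulate(steps: int, feedback: bool) -> float:
    """Propagate an initial unit disturbance for `steps` control cycles."""
    x = 1.0
    for _ in range(steps):
        u = -K * x if feedback else 0.0   # the computer's correction
        x = A * x + u
    return x

print(f"open loop after 50 steps:   {simulate(50, False):.1f}")
print(f"closed loop after 50 steps: {simulate(50, True):.6f}")
```

Open loop the disturbance grows past a hundredfold; closed loop it decays to nearly nothing. A human pilot cannot apply corrections at the rate and precision this requires, which is why relaxed-stability designs only became practical with digital flight computers.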

In the land of commercial vehicles, the old way of doing things is still the only way to do things. But when you try to make a dinosaur digital — at the end of the day, it is still a dinosaur.

¹ A software engineer for the AGC, Don Eyles, gives a fascinating insight into programming the AGC in his book Sunburst and Luminary, named for two of the programs that the flexible, powerful, and efficient AGC ran. ^

² The Globus is taken apart and investigated here. Look at all those gearwheels! ^

³ For instance, the Globus could only operate for a fixed orbital inclination and for circular orbits, rendering it useless for rendezvous and docking. It was only replaced in 2002! ^

⁴ Apollo 14 had a critical fault on the way to the Moon. Demonstrating the power and flexibility of the AGC, the flight software was reprogrammed by the astronauts when they were already orbiting the Moon. ^

⁵ The continuing dominance of Moore’s Law has dramatically reduced the cost of transistors, increased the transistor count, and, through shrinking die size, massively reduced the energy cost per calculation. This is the main reason we have highly sophisticated and connected pocket computers now, less so because of improvements in battery technology. ^

⁶ To return to the comparison of the AGC and the Globus, the AGC deployed 17,000 transistors. The Globus had one. ^



2023/05/24 (updated 2023/05/24)

Lost in Transmission — Understanding the Advantages of Distributed Power and Software Control

In the dawn of the industrial era, powerplants were big, centrally located, and required a power distribution network of some kind to deliver power to specific consumers. To provide three examples of this:

  • In the late 1700s, Eli Whitney invented the cotton engine, the so-called “cotton gin”, a mechanical means of separating cotton. The power source for this was biological — human or animal effort rotated a shaft to extract cotton fibres from their seeds. The gin relied on a single source of motive power which was biological in its nature¹.
  • When the Watt steam engine was invented by James Watt in 1776² and gave rise to the first Industrial Revolution, human or animal labour was replaced with mechanical labour using the energy gained from burning coal³. A steam plant located outside a factory would drive a main shaft, which rotated to drive machinery within the factory; complex cams and other mechanical devices transmitted the power to where it needed to be deployed.
The Boulton and Watt Rotative Beam Engine built by James Watt in 1788. The engine was used at Matthew Boulton’s Soho Manufactory in Birmingham, where it drove 43 metal polishing (or ‘lapping’) machines for 70 years. Image: Science Museum Group. Rotative steam engine by Boulton and Watt, 1788. 1861–46. Science Museum Group Collection Online. Accessed May 24, 2023. https://collection.sciencemuseumgroup.org.uk/objects/co50948/rotative-steam-engine-by-boulton-and-watt-1788-beam-engine-steam-engine.
  • At the dawn of the electric age, after it was realised that electric potential could be used to do useful work, large-scale power stations were built by Edison and Westinghouse, burning coal or using the flow of a river to turn electromagnetic machines and generate electricity. These large power stations would provide all the power for a district or a city, distributed via electrical conductors known as transmission cables.

All of these systems had something in common: a centralised source of power, which then had to transmit the energy along a distribution system, to where it would ultimately be expended.

This is a theme that we see commonly throughout technologies developed in the 18th/19th/20th centuries. The steam locomotive is exactly this: coal is burned on a vehicle to generate heat, the heat turns water to steam and the steam drives pistons via a mechanical transmission (the connecting rods) which are attached to and rotate the traction wheels. The same paradigm is dominant in passenger cars and commercial vehicles to this day: a large central powerplant converts chemical energy (from gasoline or diesel) to thermal energy which is then used to drive cylinders to turn a crankshaft (converting reciprocating motion to rotating motion), and then the crankshaft output mechanically drives a gearbox to a mechanical differential and then via a driveshaft the wheels.

The challenge with all of these centralised systems is the losses in the system. Taken end to end, from fuel burned at the power station to energy delivered to the consumer, even the UK national grid, probably one of the best and most tightly integrated energy grids on the planet, delivers only around a quarter of the primary energy it consumes, with the balance lost to generation losses and to transmission losses (typically resistive losses on distribution cables, plus conversion losses as voltages are stepped up and down at various points). Imagine seeing a gas tanker on the highway throwing three quarters of the fuel it was carrying out of the back as it travelled! We would consider that unacceptable. Yet somehow this kind of loss is accepted in modern energy systems.

This includes road vehicles and their internal power distribution system, the mechanical driveline. In a road vehicle, mechanical driveline losses can account for as much as 5% of the total energy consumed; since the engine itself delivers only a fraction of that energy as useful output, this corresponds to some 20–30% of the engine’s output being lost to rotating bits of metal. This is particularly pronounced in urban areas, with frequent stops, starts, and turns, where mechanical transmission components have to start and stop rotating frequently.
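The two percentages above describe the same loss measured against different baselines, which a tiny loss-cascade sketch makes concrete (the engine efficiency and driveline loss figures are assumed round numbers for illustration):

```python
# Sketch of the loss cascade (illustrative figures): 5% of fuel energy
# lost in the driveline is about 20% of what the engine actually delivers.
fuel_energy = 100.0      # arbitrary units of fuel energy in
engine_eff = 0.25        # assumed: ~75% lost as heat and noise in the engine
engine_out = fuel_energy * engine_eff      # 25 units reach the crankshaft
driveline_loss = 5.0                       # assumed: ~5% of fuel energy
at_wheels = engine_out - driveline_loss    # 20 units do useful work

print(f"driveline loss = {driveline_loss / fuel_energy:.0%} of fuel energy, "
      f"{driveline_loss / engine_out:.0%} of engine output")
```

The same 5 units of loss are a rounding error against the fuel burned, but a full fifth of the mechanical work actually produced; this distinction matters greatly once the engine is replaced with an efficient electric motor.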

This centralised power paradigm is, however, being turned on its head. Take the example of the helicopter, which used a large central powerplant (or two) and a fiendishly complex mechanical arrangement to turn its rotor (or, in some cases, two rotors) and control the pitch and yaw of the rotorcraft. Despite fixed-wing aircraft working with multiple jet turbines distributed along the wings, the helicopter stubbornly persisted with the centralised power paradigm, because precise, safe, and reliable mechanical control of multiple rotors was difficult; and engines are heavy and bulky, and it is difficult to fit many of them on a rotorcraft already at the edge of its performance envelope.

Then, almost overnight in the late 2000s, dirt-cheap, powerful microcontrollers and inexpensive, small, powerful electric motors gave rise to hobby drones. All the control previously exerted by a helicopter’s mechanical systems was suddenly managed by software control of motor speed, allowing low-cost, mass-produced, simple drones to proliferate rapidly. We are now seeing an increasing number of passenger multirotor aircraft using distributed electric propulsion, promising fast, cheap, safe, reliable, and highly automated air travel. Removing the requirement for a highly skilled pilot should improve the accessibility of a mode of transport previously reserved for the extremely wealthy or for specialised purposes⁴.

Likewise in the rail industry, the steam locomotive was superseded by the diesel and then the electric (or diesel-electric) locomotive: a centralised power unit pulling the train along. Nowadays virtually every light rail vehicle, and many commuter trains, are Electric Multiple Units (EMUs) or Diesel Multiple Units (DMUs), which, as the terms suggest, distribute the motive power amongst multiple train cars.

Even in the power grid, dispersed power systems are becoming increasingly commonplace: solar, wind, and micro-hydro coupled with batteries mean that energy can be generated locally, stored locally, and consumed locally, eliminating lengthy transmission losses, often with a reduced capital burden⁵.

Our fundamental approach to vehicle design espouses this distributed/dispersed power concept. When designing an electric vehicle, it is simple enough and low-risk to replace an internal combustion engine with an electric motor and a fuel tank with a large battery. However, there is a fundamental problem here: namely, batteries suck.

The best lithium battery in the world today has approximately 1/10th the energy density (by weight) of liquid hydrocarbons, and is considerably more bulky, expensive, and difficult to package than a fuel tank. The entire modern economy has been built, since the late 19th century, on the abundance of cheap and readily accessible energy in the form of liquid hydrocarbons (oil, gasoline, diesel, kerosene…). What this means is that to achieve the ranges required of road vehicles in a modern economy, a very large battery is required. This is heavy, and it leads down a particularly vicious circle: more batteries are carried to provide a reasonable range, then additional batteries are required to propel the first set of batteries, and so on⁶. This reduces the efficiency of the vehicle and results in very heavy cars and trucks (which in turn require more energy to move them…). In trucks, of course, this has a direct impact on the available payload capacity: lightweight trucks are hard.
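The vicious circle can be made quantitative with a toy sizing model: the battery must carry enough energy to move the vehicle plus the battery itself, so its mass is found by iterating until the number stops changing. Every figure here (pack energy density, road-load energy per kg-km, base vehicle mass) is an assumed round number for illustration only:

```python
# Toy model of the battery "vicious circle" (all numbers assumed):
# extra battery mass itself takes energy to move, so the pack must be
# sized by fixed-point iteration until the mass converges.
SPEC_ENERGY = 200.0    # assumed usable Wh per kg of battery pack
WH_PER_KG_KM = 0.05    # assumed Wh to move 1 kg over 1 km (road load)

def battery_mass(base_kg: float, range_km: float, iters: int = 50) -> float:
    """Battery mass needed to move vehicle + battery over range_km."""
    m_batt = 0.0
    for _ in range(iters):
        energy_wh = WH_PER_KG_KM * (base_kg + m_batt) * range_km
        m_batt = energy_wh / SPEC_ENERGY
    return m_batt

print(f"300 km range: {battery_mass(10_000, 300):.0f} kg of battery")
print(f"600 km range: {battery_mass(10_000, 600):.0f} kg of battery")
```

Note that doubling the range more than doubles the battery, because the extra pack must also haul itself around; this superlinear growth is exactly why efficiency gains elsewhere in the vehicle compound so powerfully.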

There is another way to improve the available range from a given battery, and that is to minimise losses in the system. Looking at a Sankey or waterfall diagram of energy loss in an automotive powertrain, far and away the majority of the loss is encapsulated within the internal combustion engine. Replacing this with an electric motor of some variety eliminates a significant portion of the losses. However, the part that is often forgotten or glossed over is that the mechanical driveline (gearbox, propeller shaft, differential, and so on) consumes roughly 20% of the remaining energy. In aggregate, driveline loss is relatively small, perhaps 5% of the total loss in a passenger car, but it has a disproportionate impact in an EV.

US Department of Energy/EPA/Office of Energy Efficiency & Renewable Energy

Because of their poor energy density, batteries are extremely sensitive to even small losses. So the simplest way to eliminate this loss is to eliminate as much of the mechanical driveline as possible, and to replace its critical functionality (primarily the differential, which allows the wheels to turn at different rates when cornering) with electronics and software. This gives rise to a system with four motors⁷, and electronic and software control of what was previously handled by a bunch of meshing gears.
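One core job a software replacement for the differential must perform can be sketched in a few lines: in a corner, the inner and outer wheels trace arcs of different radii, so each motor must be commanded a different speed. This is a minimal geometric illustration with an assumed track width, not a representation of any production control law (which must also handle traction, faults, and safety cases):

```python
# Minimal sketch of one job a software "differential" must do: command
# different wheel speeds in a corner. Track width is an assumed figure.
TRACK = 2.0   # assumed distance between left and right wheels, metres

def wheel_speeds(v_mps: float, turn_radius_m: float) -> tuple[float, float]:
    """(inner, outer) wheel speeds for a vehicle centreline moving at
    v_mps around a corner of the given radius."""
    inner = v_mps * (turn_radius_m - TRACK / 2) / turn_radius_m
    outer = v_mps * (turn_radius_m + TRACK / 2) / turn_radius_m
    return inner, outer

inner, outer = wheel_speeds(10.0, 20.0)   # 10 m/s around a 20 m corner
print(f"inner {inner:.2f} m/s, outer {outer:.2f} m/s")
```

A mechanical differential does this passively; software must compute and command it explicitly, every control cycle, which is precisely why doing it reliably and safely is the hard part.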

Now this sounds straightforward, and indeed there have been many demonstrators of such technology, but the trick — the secret sauce, if you like — is to do this reliably and safely. This is surprisingly hard, but with some smart people, some advanced mathematics, and a good safety culture it can be accomplished.

This allows the energy stored in the battery to be used for actually useful work, i.e., driving a further distance rather than just spinning bits of metal. It also means the battery can be downsized, and here we fall into a virtuous circle: with a smaller battery the overall system is more efficient, which in practical terms means carrying more payload using less energy.

We must move away from the paradigm that has dominated since the 18th century — and instead define a new paradigm for the challenges of the 21st.

¹ There is a thought here which is possibly too tangential: but the creation of centralised power sources almost universally caused the disenfranchised to suffer. The invention and proliferation of the cotton gin is regarded as being one of the causes of the start of the US Civil War. ^

² The steam engine has had a very, very long (and fascinating!) period of invention with multiple contributors, sometimes in parallel, developing innovations in design, function, and purpose^

³ Steam plants in mills created awful conditions for the poor and dispossessed, either through poor living conditions (living next to a coal plant is unhealthy) but also direct physical harm when labourers (largely women and children) would get caught up in the mechanical transmission and get mangled, maimed, or killed. Interestingly enough the Bristol Tramway Company was created to alleviate this suffering. ^

Back to coal: over the course of the 19th and early 20th centuries, the western side of European and American cities were largely where the wealthy stayed, as prevailing winds blew the coal dust, ash, and smoke plume to the east — this has been studied as an economic phenomenon as well as a curious experiment in natural selection.

⁴ Dispersed power is actually an empowering technology. Unlike the old paradigm of centralised power, which lifted many out of energy scarcity but by and large punished the most disenfranchised and pushed significant negative externalities on everyone, dispersed power is a capital-efficient way to lift communities out of energy poverty, and improve safety (through lighting), economic prospects (through general availability of energy for purposes beyond subsistence), and connectivity (through powering communications devices such as smartphones and Internet terminals). ^

⁵ Interestingly, we see a similar parallel in telecommunications. Large, centrally switched phone systems — you would be physically connected to someone across the country! — being dispersed through cell towers, and now with megaconstellations such as Starlink utilising thousands of satellites, dispersed low-cost, high-bandwidth, communications systems. ^

⁶ In a slightly different context, rocket scientists call this the Tyranny of the Rocket Equation — the more fuel you carry, the more fuel you need (i.e., fuel must be added to propel the fuel that was added to propel…) ^
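For the curious, the relationship this footnote alludes to can be made precise. The classical Tsiolkovsky rocket equation relates the achievable change in velocity $\Delta v$ to the exhaust velocity $v_e$ and the vehicle's mass ratio:

```latex
\Delta v = v_e \ln\frac{m_0}{m_f}
\quad\Longrightarrow\quad
\frac{m_0}{m_f} = e^{\Delta v / v_e}
```

where $m_0$ is the initial (fuelled) mass and $m_f$ the final (dry) mass. The required mass ratio — and hence the propellant load — grows exponentially with the desired $\Delta v$, which is exactly the "tyranny": every kilogram of dry mass saved pays back many times over in fuel not carried.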

⁷ Historical examples of multiple-motor drive include the 1900 Lohner-Porsche series hybrid and the Lunar Roving Vehicle (“Moon Buggy”) — the latter, of course, designed to be lightweight! ^



2023/02/14 (updated 2023/03/03)

One of the fundamental challenges facing society today is the preservation of economic growth¹ while managing a trifecta of problems: reducing greenhouse gas (GHG) emissions, outputting less pollution (especially in urban areas), and reducing the energy requirement of transport logistics (which is interrelated with both of the aforementioned issues, as well as with economics²).

Why do we need to solve these interrelated but distinct problems?

GHG emissions (carbon dioxide and CO₂-equivalent emissions) are, essentially, causative of climate change, which is an existential threat — not necessarily to humanity in general (presumably some people will survive catastrophic climate change), but certainly to our way of life, standard of living, and ability as a society to progress technologically. A point that is not made nearly often enough: pretty much all of the easily accessible raw materials, such as coal, iron, and oil, have already been extracted from the earth's crust. If we lose access to the technology stack humanity has developed because of widescale civilisational destabilisation, it will be almost impossible for our descendants or successors to rebuild that same stack — complex societies will not be able to re-emerge on this planet for a very, very long time. This is a serious problem!

Pollution, as distinct from GHG emissions, tends to be a more localised issue. There are a few different types of pollution, but in general we are referring to poor air quality caused by localised outputs, including metallic and plastic particulate matter, carbon black, and gases such as the nitrogen oxides (NOx).

Particulate matter — PM — is a particularly pernicious issue, with links to respiratory and cardiovascular disease, and increasingly correlated with Parkinson's, Alzheimer's, dementia, and other ailments. Human physiology has not really evolved to deal with extremely fine particulates (particularly those of 2.5 micron diameter or smaller, referred to as PM2.5). Once inhaled, these very fine particulates are tiny enough to pass through the alveoli of our lungs into the bloodstream, and small enough to cross the blood–brain barrier and lodge in the delicate neural structures of the brain.

PM can be emitted as a byproduct of internal combustion, but also through vehicle travel and braking — specifically, tyre wear and brake dust.

Pollution also refers to localised outputs of gases such as NOx and others, which create smog and are strongly linked to respiratory disease and increased mortality — up to 6.5 million deaths per year are attributable to poor air quality!

Reduction of the energy required to move cubes⁴ may seem unrelated to the two preceding issues. But with transport activity growing ever more intensive as e-commerce penetration increases, and with the now-expected convenience of readily available produce and materials, next-day or even same-day delivery is not going away³. What is necessary to continue enabling this growth, then, is a longer-term reduction of the energy requirement per unit of goods moved, in order to:

  1. Diminish the output of GHG and pollution associated with transport logistics, and,
  2. Reduce the costs of transport logistics (which is good for economic growth).
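The "energy per unit of goods moved" framing can be made concrete with some simple arithmetic. The sketch below is purely illustrative — the tonnage, pallet count, and Wh-per-tonne-km figures are hypothetical numbers chosen only to show the shape of the argument, not measured data — but it captures why tare mass matters: much of a truck's energy goes into moving the truck itself, so cutting tare mass directly cuts the energy (and cost) per pallet-km.

```python
# Illustrative sketch: energy per pallet-km as a function of vehicle tare mass.
# All numbers are hypothetical, for illustration only.

def energy_per_pallet_km(tare_kg: float, payload_kg: float, pallets: int,
                         wh_per_tonne_km: float = 70.0) -> float:
    """Energy (Wh) to move one pallet one km, assuming energy use scales
    roughly linearly with total moving mass (a crude first-order model)."""
    total_tonnes = (tare_kg + payload_kg) / 1000.0
    return total_tonnes * wh_per_tonne_km / pallets

# A conventional rigid truck vs. a (hypothetical) lightweight design,
# both carrying the same 10-pallet, 5-tonne payload.
conventional = energy_per_pallet_km(tare_kg=9000, payload_kg=5000, pallets=10)
lightweight = energy_per_pallet_km(tare_kg=4500, payload_kg=5000, pallets=10)

print(f"conventional: {conventional:.1f} Wh per pallet-km")
print(f"lightweight:  {lightweight:.1f} Wh per pallet-km")
print(f"saving:       {100 * (1 - lightweight / conventional):.0f}%")
```

Under these toy assumptions, halving the tare mass cuts the energy per pallet-km by roughly a third — without touching the powertrain at all, which is the point the paragraph above is making.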

The simplest way to reduce the energy requirement — which is often dependent on external, and sometimes unpredictable, factors — is simply to use less energy. This seems like a tautology, but it is amazing how many Rube Goldberg-type solutions claim some kind of reduction in energy use through a very complex system. At the end of the day, physics will always win. Real efficiency gains are realised through straightforward solutions, maximised through the application of sophisticated⁵, advanced controls.

Where we do these things matters as well, since we want to get the most bang for our buck, so to speak.

GHG emissions are, as previously mentioned, a global phenomenon — GHG emitted in California will, broadly speaking, disproportionately affect those in the so-called “Global South”. Reducing GHG emissions through the reduction of energy requirements therefore has a net positive effect no matter where it is deployed; and so it makes sense to deploy what are generally more expensive GHG-reducing technologies where per capita GHG emissions are most intensive, which tends to be locales with stronger economies.

Pollution is an altogether more local problem. As it turns out, built-up urban areas are far and away the most sensitive to pollution, but — ironically and unfortunately — these conurbations are also the most prone to it, due to the duty cycle requirements of those geographies (lots of stop/start, frequent tight cornering, and lower average speeds tend to prevent the dispersal of pollution, while buildings channel smog and particulate matter…).

Photo by Kristen Morith on Unsplash

This has a drastic impact on those living and working in these locales, who tend to be poorer, with fewer prospects for economic and social mobility; it also tends to fall disproportionately on the most vulnerable in the population — another social injustice brought about by the ready energy utility of liquid hydrocarbons and their displaced externalities.

Diesel is worse here than gasoline, which is worse than electric-only operation. Diesel combusts relatively incompletely, emitting soot (carbon black); smog-forming gases such as NOx; and metallic PM2.5, which, as previously described, human physiology is ill-adapted to filter out. Gasoline is a bit better in general, with a somewhat cleaner combustion, but still outputs significant levels of pollution. In urban areas, therefore, a zero-tailpipe-emissions solution is preferable.

So: technologies should be deployed where they will have the most meaningful impact. Technology for the sake of technology is generally unhelpful. The concept of low- or zero-emission long-haul trucks is all well and good for reducing GHG emissions, but since the pollution is dispersed along relatively long highway legs, it is less of an issue outside urban areas. These long-distance legs often start and end in urban or urbanised areas, however, and the urban/semi-urban delivery segment has the most challenging and difficult duty cycle to solve for; yet solving this difficult problem will have the most significant impact on air quality, directly saving lives in both the short and long term.

What is required, then, is the best option that will enable massive reductions of GHG and PM today, applying those reductions in the short term and where they matter most. Because of the rapidity of anthropogenic warming, solutions need to be deployed urgently, and therefore cannot wait for to-be-developed-and-paid-for large-scale electrical generation, distribution, and charging infrastructure⁸; the problem cannot continue to be kicked down the line.

Operators also need meaningful pull factors to encourage adoption of effective solutions; push factors such as government legislation can be highly effective, but are often equally highly politicised, and so cannot be relied upon to deliver the meaningful results that are necessary. Solutions must therefore be economically viable, environmentally meaningful, and able to slot into existing operations; otherwise there will never be takeup on the scales and timescales needed to make a difference. While many large organisations are paying for semi-effective solutions out of ESG budgets, these are neither viable long-term nor at large scale — and the problems demand widescale adoption today.

We need to provide the best solution that can be deployed at scale today, that will provide the most impactful outcomes in the geographies that are most vulnerable, and that will encourage zero-emissions (ZE) takeup, giving all logistics providers the ability to deploy ZE at scale in short order.

We at Bristol Superlight have developed such a solution, and have proven its operation over the last few years with some of the most demanding customers, in some of the most demanding applications. Our vehicle solution is lightweight and sophisticated, utilising highly advanced controls, and delivers meaningful zero-emissions operation in urban areas while slotting seamlessly into logistics carriers' day-to-day operations. It is a solution that is real, that works, and that has covered thousands of kilometres on the road in revenue service; it is economically viable from both a capital and an operational cost perspective, with significant reductions in cost per pallet-km compared not only to conventional diesel trucks, but also to other electric trucks. We have chosen not to electrify a truck, but to redefine it. And we have the evidence to back it up. It is the culmination of years of work and effort — and this is just the start.

Photo by Kane Taylor on Unsplash

¹ Economic growth for the sake of economic growth is of questionable merit. However, part of the world has reached a certain quality of life that it would prefer not to see diminished, and the rest of the world would like to reach at least that same quality of life. So: economic growth. Whether this path is advisable or healthy is debatable, to say the least, and is a whole other problem; nevertheless, we all exist in the real world, and human behaviour is, as the evidence indicates, unlikely to drastically change. ^

² It is a truism that economic growth (and more specifically, prosperity) is generally linked to the reduction of transport costs: by enabling the increasingly cheap movement of people and goods, more people have been lifted out of poverty than ever before. The paradoxical corollary is that this increasingly cheap movement has resulted in greater externalities, in terms of GHG emissions (and the resultant climate change) and of pollution (primarily, but not exclusively, around air quality), which disproportionately impact geographies with a lower level of economic development. ^

³ To the contrary, this is in some ways a more economically efficient allocation of labour — as Jeff Wilke, former CEO of Worldwide Consumer at Amazon.com, says: “delivery in some sense replaces your labor, which was unpaid, to get in your car and drive somewhere to buy something and come back.” ^

⁴ The humble pallet comes to the fore here as a near-universal footprint for easy mechanical transfer of goods; it is as ubiquitous and as game-changing as its larger cousin, the 20-foot equivalent unit (TEU) shipping container. The pallet is sometimes referred to as a cube, referring to the pallet + payload volume on the pallet. ^

⁵ Complexity is not the same as sophistication. ^

⁶ Consider the tragic case of Ella Kissi-Debrah, the child who has the unfortunate distinction of being the first person whose death was directly attributed to poor air quality. ^

⁷ Though it has to be noted that workers in warehouses servicing these diesel trucks are suing their employers over their direct, prolonged exposure to the pollution those vehicles cause. ^

⁸ Not just charging infrastructure — the International Energy Agency has determined that the transition to electrified vehicles requires roughly 50 more lithium mines, 60 more nickel mines, and 17 more cobalt mines just to meet 2030 EV projections. Going all-in on long-range BEVs for every single commercial vehicle is simply not feasible in the near to medium term. ^



2023/02/28

Bristol Superlight (BSL) announced today that it has moved into its new production facility and headquarters in Yateley, Hampshire.

Bristol Superlight