Repeated exposure to major disasters does not make people mentally stronger, a recent study found: individuals who have been repeatedly exposed to major disasters show a reduction in mental health scores.
The Illinois Sustainable Technology Center (ISTC) will begin cultivating spirulina for the animal feed market in a $2.5 million, three-year project. It aims to demonstrate that large-scale algae production can be both cost-effective and environmentally friendly. Construction of algal ponds is now underway and should be completed by the middle of 2022.
This initiative sits within a wider $67 million carbon capture research project funded by the U.S. Department of Energy. It will pilot-test a carbon dioxide capture system designed to process flue gas from the City Water, Light and Power plant in Springfield, Illinois. The carbon captured by this technology will be fed into the algal ponds, and the algae will also be grown on nutrient-rich wastewater from local treatment plants.
Cultivation methods that use industrial by-products serve agricultural, waste treatment, and carbon capture purposes simultaneously, making the approach more cost-effective.
People interact with machines in countless ways every day. In some cases, they actively control a device, like driving a car or using an app on a smartphone. Sometimes people passively interact with a device, like being imaged by an MRI machine. And sometimes they interact with machines without consent or even knowing about the interaction, like being scanned by a law enforcement facial recognition system.
Human-Machine Interaction (HMI) is an umbrella term that describes the ways people interact with machines. HMI is a key aspect of researching, designing and building new technologies, and also of studying how people use and are affected by them.
Researchers, especially those traditionally trained in engineering, are increasingly taking a human-centered approach when developing systems and devices. This means striving to make technology that works as expected for the people who will use it by taking into account what’s known about the people and by testing the technology with them. But even as engineering researchers increasingly prioritize these considerations, some in the field have a blind spot: diversity.
Researchers and developers typically follow a design process that involves testing key functions and features before releasing products to the public. Done properly, these tests can be a key component of compassionate design. The tests can include interviews and experiments with groups of people who stand in for the public.
In academic settings, for example, the majority of study participants are students. Some researchers attempt to recruit off-campus participants, but these communities are often similar to the university population. Coffee shops and other locally owned businesses, for example, may allow flyers to be posted in their establishments. However, the clientele of these establishments is often students, faculty and academic staff.
In many industries, co-workers serve as test participants for early-stage work because it is convenient to recruit from within a company. It takes effort to bring in outside participants, and when they are used, they often reflect the majority population. Therefore, many of the people who participate in these studies have similar demographic characteristics.
It is possible to publish a research paper that adds to a field’s body of knowledge using a homogenous sample of people, and some researchers who conduct studies this way acknowledge the limitations of homogenous study populations. However, when it comes to developing systems that rely on algorithms, such oversights can cause real-world problems. Algorithms are only as good as the data used to build them.
Algorithms are often based on mathematical models that capture patterns in data and tell a computer how to use those patterns to perform a given task. Imagine an algorithm designed to detect when colors appear on a clear surface. If the set of images used to train that algorithm consists mostly of shades of red, the algorithm might fail to detect a shade of blue or yellow.
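The color-detector thought experiment above can be sketched in a few lines of code. This is a toy illustration, not any real system: the detector, its training data and its thresholds are all hypothetical, chosen only to show how a homogeneous training set limits what an algorithm can recognize.

```python
# Toy "color detector": it learns which RGB values count as "color present"
# purely from its training examples. All names and numbers are hypothetical.

def train_detector(samples):
    """Learn per-channel min/max bounds from 'color present' training pixels."""
    mins = [min(s[i] for s in samples) for i in range(3)]
    maxs = [max(s[i] for s in samples) for i in range(3)]

    def detect(pixel):
        # A pixel counts as "color" only if every channel falls inside
        # the range seen during training.
        return all(mins[i] <= pixel[i] <= maxs[i] for i in range(3))

    return detect

# The training set contains only shades of red (high R, low G and B).
red_shades = [(200, 10, 10), (230, 40, 30), (180, 5, 20), (250, 60, 50)]
detect = train_detector(red_shades)

print(detect((210, 20, 15)))  # a red shade: detected
print(detect((20, 30, 220)))  # a blue shade: missed, despite being a color
```

The detector is not broken in any mechanical sense; it simply never saw blue, so blue falls outside everything it learned. The same dynamic plays out when study populations, rather than pixels, are the training data.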
Even as the U.S. fights COVID-19, the lack of diverse training data has become evident in medical devices. Pulse oximeters, which are essential for keeping track of your health at home and for indicating when you might need hospitalization, may be less accurate for people with melanated skin. These design flaws, like those in algorithms, are not inherent to the device but can be traced back to the technology being designed and tested on populations that were not diverse enough to represent all potential users.
Researchers in academia are often under pressure to publish research findings as quickly as possible. Therefore, reliance on convenience samples – that is, people who are easy to reach and get data from – is very common.
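The skew a convenience sample introduces can be made concrete with a small simulation. The demographic shares below are hypothetical round numbers, chosen only to contrast a campus recruitment pool with a broader population; they are not real census figures.

```python
# Minimal sketch of convenience-sampling bias, using made-up demographic
# shares: a general population versus a campus pool dominated by students.
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# (age group -> share of that pool); numbers are illustrative only.
population = {"18-24": 0.12, "25-44": 0.34, "45-64": 0.33, "65+": 0.21}
campus_pool = {"18-24": 0.80, "25-44": 0.15, "45-64": 0.04, "65+": 0.01}

def draw(dist, n):
    """Draw n participants and return each group's share of the sample."""
    groups, weights = zip(*dist.items())
    sample = random.choices(groups, weights=weights, k=n)
    return {g: sample.count(g) / n for g in groups}

convenience = draw(campus_pool, 1000)
representative = draw(population, 1000)

# Older adults barely appear in the convenience sample, so a device
# tested only on it would have almost no data about them.
print(convenience["65+"], representative["65+"])
```

The point is not the exact percentages but the structure: whatever group is rare in the recruitment pool is nearly invisible in the resulting data, no matter how large the sample.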
Though institutional review boards exist to ensure that study participants’ rights are protected and that researchers follow proper ethics in their work, it is not their role to dictate whom researchers should recruit. When researchers are pressed for time, considering different populations for study subjects can mean additional delay. Finally, some researchers may simply be unaware of how to adequately diversify their study’s subjects.
There are several ways researchers in academia and industry can increase the diversity of their study participant pools.
One is to make time to do the inconvenient and sometimes hard work of developing inclusive recruitment strategies. This can require creative thinking. One such method is to recruit diverse students who can serve as ambassadors to diverse communities. The students can gain research experience while also serving as a bridge between their communities and researchers.
Another is to allow members of the community to participate in the research and provide consent for new and unfamiliar technologies whenever possible. For example, research teams can form an advisory board composed of members from various communities. Some fields frequently include an advisory board as part of their government-funded research plans.
Another approach is to include people who know how to think through the cultural implications of technologies as members of the research team. For instance, the New York City Police Department’s use of a robotic dog in Brooklyn, Queens and the Bronx sparked outrage among residents. The backlash might have been avoided if the department had engaged with experts in the social sciences or science and technology studies, or simply consulted community leaders.
Lastly, diversity is not just about race but also age, gender identity, cultural backgrounds, educational levels, disability, English proficiency and even socioeconomic levels. Lyft is on a mission to deploy robotaxis next year, and experts are excited about the prospects of using robotaxis to transport the elderly and disabled. It is not clear whether these aspirations include those who live in less-affluent or low-income communities, or lack the family support that could help prepare people to use the service. Before dispatching a robotaxi to transport grandmothers, it’s important to take into account how a diverse range of people will experience the technology.
Two centuries of burning fossil fuels has put more carbon dioxide, a powerful greenhouse gas, into the atmosphere than nature can remove. As that CO2 builds up, it traps excess heat near Earth’s surface, causing global warming. There is so much CO2 in the atmosphere now that most scenarios show ending emissions alone won’t be enough to stabilize the climate – humanity will also have to remove CO2 from the air.
The U.S. Department of Energy has a new goal to scale up direct air capture, a technology that uses chemical reactions to capture CO2 from air. While federal funding for carbon capture often draws criticism because some people see it as an excuse for fossil fuel use to continue, carbon removal in some form will likely still be necessary, IPCC reports show. Technology to remove carbon mechanically is in development and operating at a very small scale, in part because current methods are prohibitively expensive and energy intensive. But new techniques are being tested this year that could help lower the energy demand and cost.
We asked Arizona State University Professor Klaus Lackner, a pioneer in direct air capture and carbon storage, about the state of the technology and where it’s headed.
What is direct carbon removal and why is it considered necessary?
Humanity can’t afford to have increasing amounts of excess carbon floating around in the environment, so we have to get it back out.
Not all emissions are from large sources, like power plants or factories, where we can capture CO2 as it comes out. So we need to deal with the other half of emissions – from cars, planes and things like taking a hot shower while your gas furnace puts out CO2. That means pulling CO2 out of the air.
Since CO2 mixes quickly in the air, it doesn’t matter where in the world the CO2 is removed – the removal has the same impact. So we can place direct air capture technology right where we plan to use or store the CO2.
The method of storage is also important. Storing CO2 for just 60 years or 100 years isn’t good enough. If 100 years from now all that carbon is back in the environment, all we did was take care of ourselves, and our grandkids have to figure it out again. In the meantime, the world’s energy consumption is growing at about 2% per year.
One of the complaints about direct air capture, in addition to the cost, is that it’s energy intensive. Can that energy use be reduced?
Two large energy uses in direct air capture are running fans to draw in air and then heating to extract the CO2. There are ways to reduce energy demand for both.
For example, we stumbled into a material that attracts CO2 when it’s dry and releases it when wet. We realized we could expose that material to wind and it would load up with CO2. Then we could make it wet and it would release the CO2 in a way that requires far less energy than other systems. Adding heat created from renewable energy raises the CO2 pressure even higher, so we have a CO2 gas mixed with water vapor from which we can collect pure CO2.
We can save even more energy if the capture is passive – it isn’t necessary to have fans blowing the air around; the air moves on its own.
My lab is creating a method to do this, called mechanical trees. They’re tall vertical columns of discs, about 5 feet in diameter, coated with a chemical resin, with the discs about 2 inches apart, like a stack of records. As the air blows through, the surfaces of the discs absorb CO2. After 20 minutes or so, the discs are full, and they sink into a barrel below. We send in water and steam to release the CO2 into a closed environment, and now we have a low-pressure mixture of water vapor and CO2. We can recover most of the heat that went into heating up the box, so the amount of energy needed for heating is quite small.
By using moisture, we can avoid about half the energy consumption and use renewable energy for the rest. This does require water and dry air, so it won’t be ideal everywhere, but there are also other methods.
Can CO2 be safely stored long term, and is there enough of that type of storage?
I started working on the concept of mineral sequestration in the 1990s, leading a group at Los Alamos. The world can actually put CO2 away permanently by taking advantage of the fact that it’s an acid and certain rocks are bases. When CO2 reacts with minerals that are rich in calcium, it forms solid carbonates. By mineralizing the CO2 like this, we can store a nearly unlimited amount of carbon permanently.
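The acid-base chemistry Lackner describes can be put in rough numbers. The calculation below assumes the idealized reaction CaSiO3 + CO2 → CaCO3 + SiO2; wollastonite is a textbook simplification, since real basalts are mixed silicates, so the figures are back-of-envelope only.

```python
# Back-of-envelope stoichiometry for mineral sequestration, assuming the
# idealized reaction CaSiO3 + CO2 -> CaCO3 + SiO2 (a simplification of
# real, chemically mixed basalts).

M_CO2, M_CaSiO3, M_CaCO3 = 44.01, 116.16, 100.09  # molar masses, g/mol

def rock_needed(tons_co2):
    """Tons of calcium silicate consumed to mineralize the given CO2."""
    return tons_co2 * M_CaSiO3 / M_CO2

def carbonate_formed(tons_co2):
    """Tons of solid carbonate produced from the given CO2."""
    return tons_co2 * M_CaCO3 / M_CO2

print(round(rock_needed(1.0), 2))       # ~2.64 tons of rock per ton of CO2
print(round(carbonate_formed(1.0), 2))  # ~2.27 tons of solid carbonate
```

Roughly two and a half tons of reactive rock per ton of CO2 explains why the interview emphasizes that suitable rock is abundant: the limiting factor is proving out reserves, not the chemistry.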
For example, there’s lots of basalt – volcanic rock – in Iceland that reacts with CO2 and turns it into solid carbonates within a few months. Iceland could sell certificates of carbon sequestration to the rest of the world because it puts CO2 away for the rest of the world.
There are also huge underground reservoirs from oil production in the Permian Basin in Texas, as well as large saline aquifers. In the North Sea, a kilometer below the ocean floor, the energy company Equinor has been capturing CO2 from a gas processing plant and storing a million tons of CO2 a year since 1996, avoiding Norway’s tax on CO2 releases. The amount of underground storage where we can do mineral sequestration is far larger than we will ever need for CO2. The question is how much can be converted into proven reserves.
We can also use direct air capture to close the carbon loop – meaning CO2 is captured, used and captured again so no new carbon is produced. Right now, people use carbon from fossil fuels to extract energy. You can convert CO2 to synthetic fuels – gasoline, diesel or kerosene – that have no new carbon in them by mixing the captured CO2 with green hydrogen created with renewable energy. That fuel can easily ship through existing pipelines and be stored for years, so you can produce heat and electricity in Boston on a winter night using energy that was collected as sunshine in West Texas last summer. A tankful of “synfuel” doesn’t cost much, and it’s more cost-effective than a battery.
The Department of Energy set a new goal to slash the costs of carbon dioxide removal to US$100 per ton and quickly scale it up within a decade. What has to happen to make that a reality?
DOE is scaring me because they make it sound like the technology is already ready. After neglecting the technology for 30 years, we can’t just say there are companies who know how to do it and all we have to do is push it along. We have to assume this is a nascent technology.
Climeworks is the largest company doing direct capture commercially, and it sells CO2 at around $500 to $1,000 per ton. That’s too expensive. On the other hand, at $50 per ton, the world could do it. I think we can get there.
The U.S. consumes about 7 million tons of CO2 a year in merchant CO2 – think fizzy drinks and fire extinguishers; grain silos use it to control grain dust, which is an explosion hazard. The average price is $60 to $150 per ton. So below $100 you have a market.
What you really need is a regulatory framework that says we demand CO2 is put away, and then the market will move from capturing kilotons of CO2 today to capturing gigatons of CO2.
Where do you see this technology going in 10 years?
I see a world that abandons fossil fuels, probably gradually, but has a mandate to capture and store all the CO2 long term.
Our recommendation is when carbon comes out of the ground, it should be matched with an equal removal. If you produce 1 ton of carbon associated with coal, oil or gas, you need to put 1 ton away. It doesn’t have to be the same ton, but there has to be a certificate of sequestration that assures it has been put away, and it has to last more than 100 years. If all carbon is certified from the moment it comes out of the ground, it’s harder to cheat the system.
A big unknown is how hard industry and society will push to become carbon neutral. It’s encouraging to see companies like Microsoft and Stripe buying carbon credits and certificates to remove CO2 and willing to pay fairly high prices.
New technology can take a decade or two to penetrate, but if the economic pull is there, things can go fast. The first commercial jet was available in 1951. By 1965 they were ubiquitous.
The Compost Research and Education Foundation (CREF) has released fact sheets on 10 compost end uses. Each sheet describes the application, highlights the key benefits and return on investment (ROI) for the end user, reviews construction and/or application requirements, and includes a “spec sheet” of the compost parameters for that application, e.g., pH, organic matter content, stability, particle size.
Songs Of Disappearance is an entire album of calls from endangered Australian birds. Last month, it briefly perched at No. 3 on the country’s top 50 albums chart – ahead of Taylor Swift.
Anthony Albrecht produced the album with his arts organization, the Bowerbird Collective. He’s a musician and a Ph.D. candidate at Charles Darwin University, where his adviser is professor Stephen Garnett.
Conventional soybean oil is not common for making laundry surfactants since the long chain length of its fatty acids leads to molecules that lack the physical properties necessary for stain removal. Epoxides of conventional soybean oil can further lead to rapid side reactions causing undesirable by-products.
Epoxidized high-oleic soybean oil (HOSO), however, offers the flexible chemistry necessary to adjust functional groups, bio-based content, and hydrophilic-lipophilic balance.
Researchers report that they have developed 49 HOSO surfactant candidates – cationic, anionic, nonionic, and amphoteric – with up to 100% biobased content that are stable across a range of pH values.
New HOSO surfactants are poised to launch in a range of business areas.
Already, an increasing number of designers and startups across the globe are leading what Van Dongen calls the “solar movement.” As a result, solar applications are growing more and more diverse: a company in the U.S. has developed solar cells that can be integrated into windows. Another has transformed dreary solar panels into patterned masterpieces by redesigning how the silver lines look on the panels. Elsewhere, designers are creating colored glass tables that can absorb energy from daylight and charge your devices, clothes that can charge your phones, and textile roofs that can stretch over buildings while harnessing energy from the sun. As solar energy becomes more affordable, the options are increasing, and 2022 may well become a banner year for solar energy.