
Post-Vaccine Party

[Future update] Well, someone accidentally dropped an M&M in their cup of ice water, and we all panicked and scattered.
628 days ago
sometimes I watch Bob Ross on twitch (they stream him on the weekends) to hear the paint brush sounds
628 days ago
There is a Bob Ross station on Plex Live TV that shows episodes pretty much around the clock from what I can tell. Might be connected to the Twitch stream, so perhaps it is only weekends.
627 days ago
Youtube has all his videos! Also, if you're into it for the brush sounds, you should look into ASMR.

Cetacean Surroundsound


I was thinking about this whale song bunker idea the other week after reading about the potential for whale song to be used as a form of deep-sea seismic sensing. That original project—with no actual connection to the following news story—proposed using a derelict submarine surveillance station on the coast of Scotland as a site for eavesdropping on the songs of whales.

[Image: An otherwise unrelated image of whales, courtesy Public Domain Review.]

In a paper published in Science last month, researchers found that “fin whale songs can also be used as a seismic source for determining crustal structure. Fin whale vocalizations can be as loud as large ships and occur at frequencies useful for traveling through the ocean floor. These properties allow fin whale songs to be used for mapping out the density of ocean crust, a vital part of exploring the seafloor.”

The team noticed not only that these whale songs could be picked up on deep-sea seismometers, but that “the song recordings also contain signals reflected and refracted from crustal interfaces beneath the stations.” It could be a comic book: marine geologists teaming up with animal familiars to map undiscovered faults through tectonic sound recordings of the sea.
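
The geometry involved can be illustrated with a toy calculation. This sketch is not from the Science paper; the velocity and delay values below are assumptions, chosen only to show how the extra travel time of a reflected whale-song arrival translates into the depth of a crustal interface.

```python
# Back-of-the-envelope sketch (not from the paper): a whale-song pulse that
# reflects off a buried interface arrives at the seismometer later than the
# direct arrival. The reflector's depth follows from that two-way delay.

def interface_depth_m(two_way_delay_s: float, velocity_m_s: float) -> float:
    """Depth of a reflector given the two-way travel delay and seismic velocity."""
    return velocity_m_s * two_way_delay_s / 2.0

# Illustrative numbers only: a ~0.5 s extra delay at a typical basalt
# P-wave speed of ~5,000 m/s implies a reflector about 1,250 m down.
print(interface_depth_m(0.5, 5000.0))  # 1250.0
```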

There’s something incredibly beautiful about the prospect of fin whales swimming around together through the darkness of the sea, following geological structures, perhaps clued in to emerging tectonic features—giant, immersive ambient soundscapes—playfully enjoying the distorted reflections of each other’s songs as they echo back off buried mineral forms in the mud below.

I’m reminded of seemingly prescient lyrics from Coil’s song “The Sea Priestess”: “I was woken three times in the night / and asked to watch whales listen for earthquakes in the sea / I had never seen such a strange sight before.”

Someday, perhaps, long after the pandemic has passed, we’ll gather together in derelict bunkers on the ocean shore to tune into the sounds of whales mapping submerged faults, a cross-species geological survey in which songs serve as seismic media.

634 days ago

Republicans want to make it harder to pass ballot initiatives. That should alarm us | David Daley


State legislators are trying to make it more difficult for citizens to take action when their own representatives won’t

They walked through Michigan college football games dressed as gerrymandered districts. They crisscrossed Idaho in a decades-old RV dubbed the Medicaid Express. In Florida, they united black and white, left and right, Trump-loving “deplorables” and radical criminal justice reformers into a mighty moral movement to end an ugly vestige of Jim Crow.


641 days ago

Season Six, Episode 19: The Unnatural


The Homestead Grays jersey I’m wearing in this strip is real; I wear it in triple homage to the Negro Leagues, my hometown of Pittsburgh, and this X-Files episode. I got it from Steel City Cotton Works, which seems to be out of the jersey but still sells a T-shirt version. This episode introduces Arthur […]
664 days ago

Octopuses Find New Hunting Buddies - Issue 95: Escape


Under a coral ledge, a day octopus and a brown-marbled grouper meet. The grouper was there first, as if waiting, and they emerge together; the octopus zips ahead, skin turning from deep scarlet to purple-blue, and arrives at a boulder. Beneath the boulder is a hollow, a perfect hiding place for small prey. The octopus’s skin flickers and momentarily resembles the grouper’s own mottled scales before reverting to blue. They converge side-by-side in front of the hollow, seeming to consider it.

Again the octopus’s skin mottles, and then again, the second time not much resembling the grouper’s patterns but rather a constellation of lights. The octopus swims to the boulder’s far side and makes a barrier with tentacles now spread flat and turned white. The grouper dives into the hollow, looking for fish whose escape routes have been blocked. Never mind the 500-million-year taxonomic gulf that separates them. Hunger, opportunity, and smarts cross that divide easily, hinting at adaptations that could help these unlikely partners navigate lean years ahead and providing an interspecies union to tickle human imaginations.

“Crossing a phylum gap like this is almost like a human talking to a snail,” says Daniel Bayley, a marine biologist at University College London. “It’s amazing to watch.” Bayley and Amelia Rose of Oxford University observed octopuses working with brown-marbled grouper—and with peacock grouper and gold-saddle goatfish, too—in sunlit shallows off atolls in the northern Chagos Archipelago, a remote chain of islands in the Indian Ocean. The incidents caught their attention: While multispecies associations are not uncommon in the animal kingdom, it’s unusual to see them between such different creatures. What’s more, this was a true collaboration.

[Image: courtesy Daniel Bayley.]

The animals were not merely taking advantage of the other’s activities, as happens when fish follow turtles across the seafloor, snapping up critters scattered by their sediment-disturbing foraging. They actively communicated with one another—not unlike the coyote and badger who starred in a viral video last year, with the coyote making a distinctive canine play bow to invite the badger along. There’s a lot of cognitive sophistication packed into such interactions.

That octopuses would be involved is likely no surprise for people attuned to a cephalopod-celebrating zeitgeist inspired by books like Sy Montgomery’s best-selling The Soul of an Octopus and the wildly popular documentary My Octopus Teacher, which showed marvelous behaviors that had yet to be described scientifically: an octopus camouflaging herself with seashells, or playing with schools of fish. Popular footage likewise preceded scientific description, with octopus-grouper unions featured in the Blue Planet II television series and in a handful of YouTube videos. In the Chagos Archipelago, though, Bayley and Rose had the chance to observe the collaborations in detail.


As they recount in a study published in the journal Marine and Freshwater Behaviour and Physiology, the researchers even started to decipher some of the signals. When octopuses preceded the hunt with what Bayley and Rose called a “pounce” gesture, engulfing corals with their mantle and turning white, it was their fish partners who ate. When instead octopuses made a “groping” gesture, inserting tentacles into a crack or crevice, it was their turn to dine. “It was very much a give-and-take scenario,” Bayley said. Other gestures and skin-pattern signals remain uninterpreted. (Bayley and Rose did not observe octopuses “punching” their partners, as another team of researchers recently documented in the Red Sea.)

Asked to review the footage, Culum Brown, a behavioral ecologist at Macquarie University who specializes in fish intelligence, said “they are clearly communicating with one another.” Brown, who agrees that the hunts are planned, is curious about how they’re initiated. Do octopuses give the go-ahead? Or fish? And how are partners chosen? Brown pointed to research on groupers cooperating with moray eels to hunt. Those fish select the most competent eels, demonstrating a collaborative sophistication once thought limited to humans and certain primates. He also noted that, while octopuses seemed to take the lead, the collaborations seen by Bayley and Rose demand quite a bit of intelligence from the fish as well.

Bayley’s own specialty is coral reef ecology, and his curiosity is piqued by what the cooperation represents. Though scientists have studied the Chagos Archipelago for decades, he said, no record exists of octopuses and fish hunting together there. He suspects it’s a recent adaptation. In the last several years, climate change-induced coral die-offs have radically disrupted the reefs’ animal populations and changed their very topography. There is much less prey than before; fields of rubble are not ideal for ambush-style hunting tactics that once worked so well.

“This could be driving this novel interaction where both fish and octopus are searching for food, but their individual hunting strategies are not best for this flattened environment that’s losing a lot of its structure,” Bayley explained. Partnerships with gold-saddle goatfish are also notable in that goatfish are adapted to find prey buried in sediment. “It’s another skill set to bring in. It’s almost like the octopus is assembling a team with specialist skills,” said Bayley.

Another fascinating question, he said, is how knowledge spreads of hunting strategies and mutually understood signals. Perhaps each participating octopus and fish works it out for themselves, through trial and error—or perhaps, following an initial breakthrough, knowledge spreads by observation or even active teaching. That would make it a cultural adaptation, a matter of accumulating knowledge passed between generations, an example of species surviving in a fast-changing world not because of some fortunate genetic mutation but because they are learning. This also suggests a potentially key role for grouper. Unlike day octopuses, who reach a ripe old age at 15 months, brown-marbled and peacock grouper can easily live 40 years or longer. They may be a living library for their short-lived partners.

That the Chagos Archipelago is so remote and well-protected—commercial fishing is banned there—may also help these underwater cultures evolve and proliferate, Bayley said. Individuals who would otherwise spread their knowledge don’t end up on a dinner plate. Brown agreed, noting an emerging scientific literature on the importance of animal culture to conservation and also the hypothesis that a loss of culture has hindered the recovery of some populations from overfishing. “One of the explanations is that we’ve basically fished out the cultural information about where to forage, where to migrate, where the best breeding grounds are,” he said. “If you’re consistently removing the biggest, the oldest and wisest, from the population, then that information is lost.”

That understanding adds new importance to halting overfishing, and also to curbing pollution and excessive underwater noise that can impair learning, in other parts of the ocean—and maybe the sheer wonder of watching octopuses and fish work together can help inspire people to take action. Brown, echoing Bayley, marveled at the cross-phylum communication. “If you think about it, that’s like us speaking to fish,” said Brown, who could not think of an example of humans communicating in such a sophisticated manner with either fish or cephalopods. The closest analogue is people and their dogs. “It’s pretty astonishing,” he said.

Brandon Keim (Twitter / Instagram) is a science and nature journalist. He is presently working on Meet the Neighbors, a book about what understanding the minds of animals means for human relations to wild animals and to nature.

Research by Daniel Bayley was funded by the Bertarelli Foundation. You can find out more about its marine science program here.

Lead art: ezumer / Shutterstock

This article was originally published on our Oceans Channel in January 2021.

669 days ago

Artificial intelligence model detects asymptomatic Covid-19 infections through cellphone-recorded coughs

[Via Dewayne Hendricks <dewayne@warpspeed.com> and
  "David J. Farber" <farber@gmail.com>]

Jennifer Chu, MIT News Office, 29 Oct 2020


Asymptomatic people who are infected with Covid-19 exhibit, by definition,
no discernible physical symptoms of the disease. They are thus less likely
to seek out testing for the virus, and could unknowingly spread the
infection to others.

But it seems those who are asymptomatic may not be entirely free of changes
wrought by the virus. MIT researchers have now found that people who are
asymptomatic may differ from healthy individuals in the way that they
cough. These differences are not decipherable to the human ear. But it turns
out that they can be picked up by artificial intelligence.

In a paper published recently in the IEEE Journal of Engineering in Medicine
and Biology, the team reports on an AI model that distinguishes asymptomatic
people from healthy individuals through forced-cough recordings, which
people voluntarily submitted through web browsers and devices such as
cellphones and laptops.

The researchers trained the model on tens of thousands of samples of coughs,
as well as spoken words. When they fed the model new cough recordings, it
accurately identified 98.5 percent of coughs from people who were confirmed
to have Covid-19, including 100 percent of coughs from asymptomatics—who
reported they did not have symptoms but had tested positive for the virus.

The team is working on incorporating the model into a user-friendly app,
which if FDA-approved and adopted on a large scale could potentially be a
free, convenient, noninvasive prescreening tool to identify people who are
likely to be asymptomatic for Covid-19. A user could log in daily, cough
into their phone, and instantly get information on whether they might be
infected and therefore should confirm with a formal test.

“The effective implementation of this group diagnostic tool could diminish
the spread of the pandemic if everyone uses it before going to a classroom,
a factory, or a restaurant,” says co-author Brian Subirana, a research
scientist in MIT's Auto-ID Laboratory. Subirana's co-authors are Jordi
Laguarta and Ferran Hueto, also of the Auto-ID Laboratory.

Vocal sentiments

Prior to the pandemic's onset, research groups already had been training
algorithms on cellphone recordings of coughs to accurately diagnose
conditions such as pneumonia and asthma. In similar fashion, the MIT team
was developing AI models to analyze forced-cough recordings to see if they
could detect signs of Alzheimer's, a disease associated with not only memory
decline but also neuromuscular degradation such as weakened vocal cords.

They first trained a general machine-learning algorithm, or neural network,
known as ResNet50, to discriminate sounds associated with different degrees
of vocal cord strength. Studies have shown that the quality of the sound
*mmmm* can be an indication of how weak or strong a person's vocal cords
are. Subirana trained the neural network on an audiobook dataset with more
than 1,000 hours of speech, to pick out the word *them* from other words
like *the* and *then*.
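
The article doesn't give implementation details, but a common way to feed audio to an image-style network such as ResNet50 is to first turn each recording into a 2-D time-frequency array. The sketch below is a minimal, hypothetical version of that preprocessing step; the frame and hop sizes are assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical preprocessing sketch: convert a waveform into a log-magnitude
# spectrogram, the kind of 2-D "image" a CNN such as ResNet50 can consume.
# Frame/hop sizes are illustrative assumptions, not the authors' settings.

def log_spectrogram(waveform: np.ndarray, frame: int = 512, hop: int = 256) -> np.ndarray:
    """Stack windowed FFT magnitudes into a (frames x frequency-bins) array."""
    frames = []
    for start in range(0, len(waveform) - frame + 1, hop):
        windowed = waveform[start:start + frame] * np.hanning(frame)
        mag = np.abs(np.fft.rfft(windowed))
        frames.append(np.log1p(mag))  # log compression, as in log-mel pipelines
    return np.array(frames)

# One second of synthetic "audio" at 16 kHz:
spec = log_spectrogram(np.random.default_rng(0).standard_normal(16000))
print(spec.shape)  # (61, 257)
```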

The team trained a second neural network to distinguish emotional states
evident in speech, because Alzheimer's patients—and people with
neurological decline more generally—have been shown to display certain
sentiments such as frustration, or having a flat affect, more frequently
than they express happiness or calm. The researchers developed a sentiment
speech classifier model by training it on a large dataset of actors
intonating emotional states, such as neutral, calm, happy, and sad.

The researchers then trained a third neural network on a database of coughs in order to discern changes in lung and respiratory performance.

Finally, the team combined all three models, and overlaid an algorithm to
detect muscular degradation. The algorithm does so by essentially simulating
an audio mask, or layer of noise, and distinguishing strong coughs—those
that can be heard over the noise—over weaker ones.
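
The masking step can be read as a signal-to-noise test. Here is a minimal sketch, assuming the mask is a fixed noise floor and "heard over the noise" means the cough's RMS level clears that floor by some margin; the 3 dB margin is an illustrative assumption, not a figure from the paper.

```python
import numpy as np

# Sketch of the described masking idea (details are assumptions, not the
# authors' code): keep only coughs whose RMS energy exceeds an overlaid
# noise floor by a given margin, discarding weaker ones.

def survives_mask(cough: np.ndarray, noise_rms: float, margin_db: float = 3.0) -> bool:
    """True if the cough's RMS level exceeds the noise mask by margin_db."""
    cough_rms = np.sqrt(np.mean(cough ** 2))
    gain_db = 20.0 * np.log10(cough_rms / noise_rms)
    return bool(gain_db >= margin_db)

rng = np.random.default_rng(1)
strong = 0.5 * rng.standard_normal(8000)   # loud cough, well above the mask
weak = 0.01 * rng.standard_normal(8000)    # faint cough, buried in the mask
print(survives_mask(strong, noise_rms=0.05), survives_mask(weak, noise_rms=0.05))
```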

With their new AI framework, the team fed in audio recordings, including of
Alzheimer's patients, and found it could identify the Alzheimer's samples
better than existing models. The results showed that, together, vocal cord
strength, sentiment, lung and respiratory performance, and muscular
degradation were effective biomarkers for diagnosing the disease.

When the coronavirus pandemic began to unfold, Subirana wondered whether
their AI framework for Alzheimer's might also work for diagnosing Covid-19,
as there was growing evidence that infected patients experienced some
similar neurological symptoms such as temporary neuromuscular impairment.

“The sounds of talking and coughing are both influenced by the vocal cords
and surrounding organs. This means that when you talk, part of your talking
is like coughing, and vice versa. It also means that things we easily derive
from fluent speech, AI can pick up simply from coughs, including things like
the person's gender, mother tongue, or even emotional state. There's in fact
sentiment embedded in how you cough,” Subirana says. “So we thought, why
don't we try these Alzheimer's biomarkers [to see if they're relevant] for
Covid-19?”

A striking similarity

In April, the team set out to collect as many recordings of coughs as they
could, including those from Covid-19 patients. They established a website
where people can record a series of coughs, through a cellphone or other
web-enabled device. Participants also fill out a survey of symptoms they are
experiencing, whether or not they have Covid-19, and whether they were
diagnosed through an official test, by a doctor's assessment of their
symptoms, or if they self-diagnosed. They also can note their gender,
geographical location, and native language.

To date, the researchers have collected more than 70,000 recordings, each
containing several coughs, amounting to some 200,000 forced-cough audio
samples, which Subirana says is “the largest research cough dataset that we
know of.” Around 2,500 recordings were submitted by people who were
confirmed to have Covid-19, including those who were asymptomatic.

The team used the 2,500 Covid-associated recordings, along with 2,500 more
recordings that they randomly selected from the collection to balance the
dataset. They used 4,000 of these samples to train the AI model. The
remaining 1,000 recordings were then fed into the model to see if it could
accurately discern coughs from Covid patients versus healthy individuals.
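
Those counts describe a standard balance-then-split procedure. A minimal sketch follows; only the 2,500/2,500 balance and the 4,000/1,000 split come from the article, while the random sampling and shuffling details are assumptions.

```python
import random

# Sketch of the balancing/splitting described in the article: downsample the
# non-Covid recordings to match the 2,500 positives, then hold out 1,000 of
# the 5,000 balanced samples for evaluation.

def balanced_split(positives, others, test_size=1000, seed=0):
    """Balance classes by downsampling the majority, then split off a test set."""
    rng = random.Random(seed)
    negatives = rng.sample(others, len(positives))  # match the positive count
    pool = positives + negatives
    rng.shuffle(pool)
    return pool[test_size:], pool[:test_size]       # (train, test)

covid = [("covid", i) for i in range(2500)]         # confirmed-positive recordings
other = [("other", i) for i in range(67500)]        # remainder of the collection
train, test = balanced_split(covid, other)
print(len(train), len(test))  # 4000 1000
```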

Surprisingly, as the researchers write in their paper, their efforts have
revealed “a striking similarity between Alzheimer's and Covid
discrimination.”

Without much tweaking within the AI framework originally meant for
Alzheimer's, they found it was able to pick up patterns in the four
biomarkers—vocal cord strength, sentiment, lung and respiratory
performance, and muscular degradation—that are specific to Covid-19. The
model identified 98.5 percent of coughs from people confirmed with Covid-19,
and of those, it accurately detected all of the asymptomatic coughs.
“We think this shows that the way you produce sound changes when you have
Covid, even if you're asymptomatic,” Subirana says.

Asymptomatic symptoms

The AI model, Subirana stresses, is not meant to diagnose symptomatic
people, as it cannot determine whether their symptoms are due to Covid-19
or other conditions like flu or asthma. The tool's strength lies in its
ability to discern asymptomatic coughs from healthy coughs.

The team is working with a company to develop a free pre-screening app based
on their AI model. They are also partnering with several hospitals around
the world to collect a larger, more diverse set of cough recordings, which
will help to train and strengthen the model's accuracy.

As they propose in their paper, “Pandemics could be a thing of the past if
pre-screening tools are always on in the background and constantly
improved.”

Ultimately, they envision that audio AI models like the one they've
developed may be incorporated into smart speakers and other listening
devices so that people can conveniently get an initial assessment of their
disease risk, perhaps on a daily basis.

This research was supported, in part, by Takeda Pharmaceutical Company.
749 days ago