What can we compare human storage capacity with

New Estimate Boosts the Human Brain’s Memory Capacity 10-Fold

A new study has found the brain’s information storage capacity may be around a quadrillion bytes

The human brain’s memory-storage capacity is an order of magnitude greater than previously thought, researchers at the Salk Institute for Biological Studies reported last week. The findings, recently detailed in eLife, are significant not only for what they say about storage space but more importantly because they nudge us toward a better understanding of how, exactly, information is encoded in our brains.

The question of just how much information our brains can hold is a longstanding one. We know that the human brain is made up of about 100 billion neurons, and that each one makes 1,000 or more connections to other neurons, adding up to some 100 trillion in total. We also know that the strengths of these connections, or synapses, are regulated by experience. When two neurons on either side of a synapse are active simultaneously, that synapse becomes more robust; the dendritic spine (the antenna on the receiving neuron) also becomes larger to support the increased signal strength. These changes in strength and size are believed to be the molecular correlates of memory. The different antenna sizes are often compared with bits of computer code, only instead of 1s and 0s they can assume a range of values. Until last week scientists had no idea how many values, exactly. Based on crude measurements, they had identified just three: small, medium and large.
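As a quick back-of-envelope check on those figures (the counts are the round numbers quoted above, not measurements), the synapse total follows directly:

```python
neurons = 100e9                  # ~100 billion neurons in the human brain
connections_per_neuron = 1000    # each makes 1,000 or more connections

total_synapses = neurons * connections_per_neuron
print(f"{total_synapses:.0e}")   # 1e+14, i.e. some 100 trillion synapses
```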

But a curious observation led the Salk team to refine those measurements. In the course of reconstructing a rat hippocampus, an area of the mammalian brain involved in memory storage, they noticed some neurons would form two connections with each other: the axon (or sending cable) of one neuron would connect with two dendritic spines (or receiving antennas) on the same neighboring neuron, suggesting that duplicate messages were being passed from sender to receiver. Because both dendrites were receiving identical information, the researchers suspected they would be similar in size and strength. But they also realized that if there were significant differences between the two, it could point to a whole new layer of complexity. If the spines were of a different shape or size, they reasoned, the message they passed along would also be slightly different, even if that message was coming from the same axon.

So they decided to measure the synapse pairs. And sure enough, they found an 8 percent size difference between dendritic spines connected to the same axon of a signaling neuron. That difference might seem small, but when they plugged the value into their algorithms, they calculated a total of 26 unique synapse sizes. A greater number of synapse sizes means more capacity for storing information, which in this case translated into a 10-fold greater storage capacity in the hippocampus as a whole than the previous three-size model had indicated. “It’s an order of magnitude more capacity than we knew was there,” says Tom Bartol, a staff scientist at the Salk Institute and the study’s lead author.
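In information-theoretic terms, 26 distinguishable synapse states correspond to log2(26), or about 4.7 bits per synapse, against log2(3), about 1.6 bits, for the old small/medium/large model. A minimal sketch (note that the study's 10-fold figure comes from its full hippocampal model, not from this ratio alone):

```python
import math

old_states = 3    # small, medium, large
new_states = 26   # distinct synapse sizes found by the Salk team

old_bits = math.log2(old_states)   # ~1.58 bits per synapse
new_bits = math.log2(new_states)   # ~4.70 bits per synapse
print(f"{old_bits:.2f} -> {new_bits:.2f} bits per synapse")
```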

But if our memory capacity is so great, why do we forget things? Because capacity is not really the issue, says Paul Reber, a memory researcher at Northwestern University who was not involved in the study. "Any analysis of the number of neurons will lead to a sense of the tremendous capacity of the human brain. But it doesn't matter, because our storage process is slower than our experience of the world. Imagine an iPod with infinite storage capacity. Even if you can store every song ever written, you still have to buy and upload all that music and then pull individual songs up when you want to play them."

Reber says that it is almost impossible to quantify the amount of information in the human brain, in part because it consists of so much more information than we’re consciously aware of: not only facts and faces and measurable skills but basic functions like how to speak and move and higher order ones like how to feel and express emotions. “We take in much more information from the world than ‘what do I remember from yesterday?’” Reber says. “And we still don’t really know how to scale up from computing synaptic strength to mapping out these complex processes.”

The Salk study brings us a bit closer, though. “They’ve done an amazing reconstruction,” Reber says. “And it adds significantly to our understanding of not only memory capacity but more importantly of how complex memory storage actually is.” The findings might eventually pave the way toward all manner of advances: more energy-efficient computers that mimic the human brain’s data-transmission strategies, for example, or a better understanding of brain diseases that involve dysfunctional synapses.

But first scientists will have to see if the patterns found in the hippocampus hold for other brain regions. Bartol’s team is already working to answer this question. They hope to map the chemicals, which pass from neuron to neuron, that have an even greater capacity than the variable synapses to store and transmit information. As far as a precise measurement of whole-brain capacity, “we are still a long way off,” Bartol says. “The brain still holds many, many more mysteries for us to discover.”

How does the human brain compare to a computer?

Posted on August 28, 2019 | Updated April 20, 2020 by Kris Sharma in Technology

We live in a world where computers can outperform humans at chess, Go, and even Jeopardy. Artificial intelligence and machine learning are creating new breakthroughs all the time, leaving us wondering whether we’ll soon be living in a technological utopia or battling for survival against a cyborg Arnold Schwarzenegger.

But do computers outperform the human brain overall? Let’s find out.

For the purpose of this article, let’s define a computer as a personal desktop for non-professional use (i.e. not a server running 24/7).

And to keep things simple, we'll limit the comparisons to four areas: storage, processing, memory, and energy efficiency.

Let the battle begin!

Storage

For day-to-day usage, most computer users will get by with 500GB of storage. Creatives, gamers, and other data-heavy users will often rely on additional storage on the cloud or on a portable SSD. For the sake of argument, we’ll give the computer an average of 1TB of storage space.

What about the brain’s storage capacity? Well, it’s complicated.

Estimates vary on how many nerve cells, or neurons, exist in a typical brain. Many studies rely on 100 billion neurons, while a Stanford University study estimates that the brain actually has 200 billion neurons.

You might be thinking, “Wait, the computer has bytes and the brain has neurons. How do we compare the two?”

One marked difference between the human brain and computer flash memory is the ability of neurons to combine with one another to assist in the creation and storage of memories. Each neuron has roughly a thousand connections to other neurons. With some 100 trillion connections in an average human brain, this overlap effect creates an exponentially larger storage capacity.

Based on our understanding of neurons today, which is very limited, we would estimate the brain’s storage capacity at 1 petabyte, which would be the equivalent of over a thousand 1TB SSDs.
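The SSD comparison is simple arithmetic: in decimal units a petabyte is exactly 1,000 terabytes, and taking a binary petabyte (2^50 bytes) against decimal-terabyte drives gives the "over a thousand" figure:

```python
decimal_pb = 1e15     # 1 petabyte in decimal units
binary_pb = 2**50     # 1 pebibyte, often loosely called a petabyte
ssd = 1e12            # marketed capacity of a 1 TB SSD, in bytes

print(decimal_pb / ssd)   # 1000.0
print(binary_pb / ssd)    # ~1125.9 -> "over a thousand" drives
```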

Advantage: Human Brain.

Memory

So far, it’s an even contest. The human brain has significantly more storage than an average computer. And a computer can process information exponentially faster than a human brain.

How about accessing memory? Can a human recall information better than a computer?

Well, it depends on what kinds of information we’re talking about.

For basic facts, the answer is unequivocally no. If a computer “knows” that the capital of Nevada is Carson City, that fact will always be accessible. A human, on the other hand, may get confused or forget that fact over time, particularly after a long weekend in Vegas.

Where computers lag behind humans is in the ability to assign qualitative rankings to information. To a computer, all information is exactly the same. Humans, on the other hand, have many different types of memories and prioritize them by importance. You will undoubtedly remember numerous details about your wedding day, but you have probably forgotten what you had for lunch last Thursday. (It was a tuna sandwich on rye, in case you were wondering.)

Humans also relate memories to one another, so your memory of New Year’s Eve will tie to all of your other New Year celebrations over the course of your life. A computer lacks this ability, at least for now.

Energy Efficiency

The contest is still a toss-up. Computers are faster and more precise, while humans have more storage capacity and nuance in accessing memories.

What about energy efficiency? Here is where it gets really fun.

A typical computer runs on about 100 watts of power. A human brain, on the other hand, requires roughly 10 watts. That’s right, your brain is ten times more energy-efficient than a computer. The brain requires less power than a lightbulb.

We may not be the brightest bulbs in the box, but then again, we don’t have to be.

Advantage: Human Brain

Conclusion

Ultimately, there is no clear winner overall. Human beings and computers have their own advantages, depending on the category. If you want precision and raw processing speed, a computer is the clear choice. If you want creativity, energy efficiency, and prioritization, a human is your best bet.

The good news is that we don’t have to choose. It doesn’t have to be a contest of humans against computers. We can work together and enjoy the best of both worlds. That is, until Skynet becomes self-aware.

Most of us know that hard drive storage capacity in our PCs, laptops, etc. is measured in bytes: millions of bytes are megabytes (MB), and billions are gigabytes (GB). Current hard drive technology is capable of storing in excess of 500 GB of data, and some drives have even exceeded 2 terabytes (TB) of storage capacity.

Has anyone come across any studies that can put a scientifically proven, definitive number on how much data the human brain can store, or is it simply a case of "we know the brain stores data/information, but we're clueless as to an upper limit"?

It is unknown, but there are some guesses. You can look first at the capacity of a single neuron and then at a bunch of interconnected neurons. The capacity of a neuron is not 1 or 0 like a computer bit; it is potentiated by the number of connections. So the capacity is dynamic and evolves, since it optimizes not only for redundancy and similarity but for connections (between concepts). This is why knowledge capacity in a brain cannot be directly measured, nor its location constrained like a computer file's. Add to this the fuzziness of inputs and outputs, and any test would be extremely hard to perform; at best you will get some estimates, but I wouldn't put much faith in them.

There is even a theory that nothing you see is truly forgotten (unless the brain is damaged); it just never makes it out into your consciousness's file system.

Another good trait the brain has is its deductive and speculative imagination. It can, with an above-normal error probability, generate valid concepts and answers about things it had no express knowledge of.

I think our capacity is what we make it. There are times when we'd like to forget things, and times when we try to learn a new math formula and start off saying, "I won't remember all of this crap!" So we forget events as time passes, and of course there is amnesia, but even then some people recover.

I like to think my mind has checkpoints of memory and within each checkpoint there are files of nouns that I remember, ya know, places, people, and things. Then I try and pinpoint the time, usually the year I was engaged with a noun. Then jog from that point.

Sometimes we may see something that reminds us of a person or event we have completely forgotten. Like last week, I was at work and I saw a kid reading "Lady and the Tramp." I remembered a plethora of things and events unrelated to the book, but it reminded me of back when I had the book, and it brought me to a checkpoint within my mind that I had completely forgotten.

So I believe we don’t truly have a capacity, at least until we die of course.

What is the information storage capacity of the human brain?

Related/bonus points: I seem to remember reading about some equation that states the amount of information that can be held by a neural network with n neurons in it arranged in l layers, or something vaguely like that (n and l probably weren’t even the letters in it.) Can anyone help me remember what I’m thinking of?

(The brain is a very large neural network. So, if we have an equation for neural networks, we should be able to get an estimate of the information contained in the human brain’s neural network.)

1 Answer

Disclaimer: Quantifying the capacity of the human brain is quite complex, as you might imagine. And although in cognitive neuroscience we often compare the brain to computers, this is not an exact comparison; in many ways the brain is far more complicated and encodes information very differently from CPU processors and hard drives. The short answer is that we have some understanding of capacity under specific situations regarding short-term memory (STM), but long-term memory (LTM) capacity is mostly based on estimates.

TL;DR:

Estimates of neural processing capacity range from 10^18 to 10^25 FLOPS, and even this range ought to be taken with a grain of salt.

The long version:

Memory in the brain is often split by cognitive psychologists into several different modules, such as the memory storage model suggested by Atkinson and Shiffrin: attention -> short-term memory (STM) -> long-term memory (LTM).

This model is largely inaccurate, as the brain encodes information relative to the type of information received; for instance, auditory inputs will be processed first by neural areas associated with auditory processing. In addition, attentional processing, unlike in this model, relies heavily on pre-cognitive processing: if you are hungry, you will attend more to food. That being said, we still use STM and LTM to distinguish between memory that is in use and memory that is stored.

Short-term memory capacity

The rather brilliant researchers Baddeley and Hitch developed what is perhaps the most compelling model of short-term memory processing. Their working memory model (see Fig. 1) accounts for differences in information types and how information is held within the processing centres of the brain.

A meta-analysis of 400 studies has displayed good support for the model's three main modules. Generally speaking, the central executive can be considered the processor, while the episodic buffer, phonological loop and visuo-spatial sketchpad act as modular RAM, holding information for processing.

According to Baddeley, the phonological loop can hold about 2 seconds of auditory information; this figure comes from lists of unrelated words, with a task designed to restrict rehearsal and encoding. However, if the information is related, say "Our lecturer told us to read chapter 3 of working memory", we can hold more of it, as related information may be chunked together. 'Chunking' is a feature of memory that groups similar information together. The capacity recorded generally depends on numerous factors, such as the task type, the time between learning and recall, and the significance of the information; we can also add age and context (internal and external) as factors influencing recall. Overall, we can't identify the exact capacity of working memory due to the complexity of information processing. What we can say is that working memory deals with small amounts of information split across different modules, and relates them to LTM, although that small amount is probably far more than your average supercomputer can process, as this post indicates.

Long-term capacity

As with STM, LTM is modular in many ways. The total capacity is somewhat related to the number of neurons: with 86-100 billion neurons plus supporting glial cells, the human brain has a large capacity for storing information. However, as previously mentioned, these neurons relate to particular types of information. With regard to processing capacity, the human brain is estimated at 10^17 FLOPS, according to the Blue Brain project. Another recent estimate puts the processing capacity of the brain at 10^28 FLOPS.

Fig 3. A comparison of recent predictions of neural processing capacity and the current fastest supercomputer.

We can use Traversed Edges Per Second (TEPS) to measure a computer's ability to communicate information internally. We can also estimate the human brain's communication performance in terms of TEPS, and use this to meaningfully compare brains to computers. We estimate that the human brain performs around 0.18-6.4 * 10^14 TEPS, which puts it within an order of magnitude of existing supercomputers. For the brain:

TEPS = synapse-spikes/second

= number of synapses in the brain * average spikes/second in synapses

= number of synapses in the brain * average spikes/second in neurons

So the brain operates at around 18-640 trillion TEPS, while the fastest supercomputer manages 2.3 * 10^13 (23 trillion) TEPS.
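The quoted TEPS range can be reproduced from the formula above; the synapse-count and firing-rate bounds below are my assumptions, chosen to match the stated 0.18-6.4 * 10^14 result, not figures from the answer itself:

```python
# Brain TEPS ~= number of synapses * average spikes/second per neuron.
synapses_low, synapses_high = 1.8e14, 3.2e14   # assumed synapse-count range
rate_low, rate_high = 0.1, 2.0                 # assumed spikes/second range

teps_low = synapses_low * rate_low      # 1.8e13  (0.18 * 10^14)
teps_high = synapses_high * rate_high   # 6.4e14

supercomputer = 2.3e13                  # TEPS figure quoted in the text
print(teps_low / supercomputer, teps_high / supercomputer)
# roughly 0.8x to 28x: within about an order of magnitude of the machine
```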

Memory has been calculated to be around 2.5 petabytes (2.5 * 10^15 bytes), as reported here (the figure appears to be based on speculation by Prof. P. Reber). Another estimate has the neural memory capacity at 8 * 10^19 bits, which is 10^19 bytes. Some researchers at Berkeley have suggested a relatively small 10-100 terabytes (10^13 to 10^14 bytes). All these estimates are based on variations in calculations relative to neuron density and synapse connections across the whole brain; the larger estimates take additional account of other factors involved in neural communication. But my overall criticism is that we can't simply say that one synapse is 1 byte or 200 calculations per second.
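Put side by side in bytes, these estimates span six orders of magnitude, which is itself the best argument for treating them as guesses. A quick conversion (the labels are mine):

```python
estimates_bytes = {
    "Berkeley low (10 TB)": 1e13,
    "Berkeley high (100 TB)": 1e14,
    "Reber (2.5 PB)": 2.5e15,
    "8e19 bits": 8e19 / 8,     # = 1e19 bytes
}
for name, size in estimates_bytes.items():
    print(f"{name}: {size:.1e} bytes")

spread = max(estimates_bytes.values()) / min(estimates_bytes.values())
print(f"largest / smallest = {spread:.0e}")   # 1e+06: a factor of a million
```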

The term 'estimate' is generous here; 'guesstimate' would be far more accurate. Individual neurons are complicated enough; moving to distributed, interconnected networks in the brain is another level entirely. Neurons do not conduct calculations independently; they rely heavily on context and information type. So we can say the processing of the brain is modular; in fact, we already know this to be true of initial sensory processing. Nor is there any clearly obvious separation of memory from processing: in motivation, for instance, certain areas (the so-called 'reward system') activate when assessing motivational objects. We wouldn't call this memory, but it relies on previous associations in memory; some areas of the brain assess rather than recall, yet they activate together. The point is that we can't bundle all neurons together to calculate memory or processing capacity. We simply don't know what the vast majority of neurons are doing right now.

I highly recommend this report, for more information and references regarding computational and memory estimates.

The Human Brain is Loaded Daily with 34 GB of Information

© Dr Michel Royon / Wikimedia Commons

The deluge of information in modern times from the media and other sources has led to a daily "bombardment" of the average human brain with a volume of information that could overload even a powerful computer, according to new U.S. scientific research.

The study, conducted by researchers at the University of California, San Diego under Roger Bohn, and reported by the British Times of London and Telegraph, found that people are inundated every day with the equivalent of 34 GB (gigabytes) of information, a quantity sufficient to overload a laptop within a week.

The study was conducted a few years ago, so the number is almost certainly much bigger in 2018-2019.

Through mobile phones, online entertainment services, the Internet, e-mail, television, radio, newspapers, books, social media and so on, people receive about 105,000 words every day across roughly 12 waking hours, or about 2.4 words per second.

Although people cannot really read 105,000 words each day, this is the number estimated to reach the human eyes and ears daily. After adding pictures, videos, games and so on, we reach an average volume of 34 gigabytes of information per day.
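The per-second rate is worth checking against the stated totals; 105,000 words spread over 12 waking hours works out to roughly 2.4 words per second:

```python
words_per_day = 105_000
waking_seconds = 12 * 3600       # a 12-hour waking day

rate = words_per_day / waking_seconds
print(round(rate, 1))            # 2.4 words per second
```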

The total consumption of information from television, computers and other information was estimated (for the U.S.) to be 3.6 million gigabytes.

Traditional media (TV and radio) continue to dominate the daily flow of information, with about 60% of consumption hours. The study considers that our brains are not directly threatened, but it does not exclude a detrimental effect of this information "flood": it might lead the human brain down a different developmental path by creating new neural connections, given that the brain has been shown to be "malleable" and can be "wired" differently depending on the quantity and quality of the stimuli it receives.

According to the researchers, the main effect of information overload is that human attention and focus are continually hampered and interrupted, which does not help the process of reflection and deeper thinking.

As the American psychiatrist Edward Hallowell commented: "Never in human history have our brains had to process as much information as today. We now have a generation of people who spend many hours in front of a computer monitor or a cell phone, and who are so busy processing information received from all directions that they lose the ability to think and feel. Most of this information is superficial. People are sacrificing depth and feeling, and are cut off from other people."

On a more hopeful note, John Stein, neuroscientist and professor of physiology at Oxford University, UK, stressed that in the Middle Ages, when printing was invented, people were likewise concerned that the human mind would not withstand so much information, which in the end proved untrue.

Bohn, who did the actual research, explained that the study did not record how much of these daily 34 gigabytes is eventually absorbed by the brain. On the other hand, he pointed out that what has changed in modern times is mainly the nature of the information received, rather than its quantity. As he said, whether looking at a computer screen or talking face to face with someone, our brain can in fact absorb the same amount of information.

A face-to-face conversation has its own equivalent in bytes of information (exactly how much is not known), since the brain monitors the other person's expressions, listens to the tones of their voice, and so on.

According to Bohn's assessment, if such a face-to-face conversation causes our brain to store information at a rate of 100 megabits per second, then a two-hour conversation will store more information than the estimated volume of electronic information received in a whole day.
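That hypothetical rate is easy to run through (the 100 megabits per second figure is the researcher's illustrative assumption, not a measurement):

```python
rate_bits_per_second = 100e6     # hypothetical 100 megabits per second
duration_seconds = 2 * 3600      # a two-hour conversation

total_bytes = rate_bits_per_second * duration_seconds / 8
print(total_bytes / 1e9)         # 90.0 GB, well above the 34 GB daily estimate
```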

How much information can the brain store?

So a question naturally comes to mind: If our brain is loaded with so much information every day, will it get full after some years? Paul Reber, professor of psychology at Northwestern University, states that our brain will certainly not get full in our lifetime.
