Both doctors and dieticians like to stress the importance of eating fruits and vegetables, which fill our bodies with nutrients, vitamins and fiber. Despite this, a recent analysis conducted by the Centers for Disease Control and Prevention (CDC) has revealed that most of us still don't consume anywhere near enough of them.
Americans are in great need of a lifestyle change, and it's not just kids and teens who turn into fussy eaters once someone brings up healthy greens, yellows and reds. The results showed that only about one in five American adults eats the amount of fruits and vegetables recommended by the federal dietary guidelines.
To be specific, the guidelines say that people should ideally eat 1.5 to 2 cups of fruit and 2 to 3 cups of vegetables every day. Most American adults fail to meet both of these standards, and of the two, more people manage a daily dose of fruit than of vegetables.
The CDC researchers did find that consumption rates differ from one state to the next, but the overall average isn't very sunny: 87 percent of American adults don't eat enough fruit on a daily basis, and 91 percent don't eat enough vegetables.
California turned out to have the highest consumption rates for both fruits and vegetables, with 13 percent of its residents eating the recommended amount of vegetables each day and 18 percent eating the recommended amount of fruit.
On the other hand, Mississippi turned out to be the state with the lowest percentage of residents eating the recommended amount of vegetables each day – just 6 percent – while Tennessee had the lowest percentage eating the recommended amount of fruit – just 8 percent.
To reach this conclusion, the CDC researchers analyzed a survey conducted in 2013 that looked at the dietary habits of no fewer than 373,000 American adults across all 50 states.
The conclusion was clear: on average, only 13 percent of American adults meet the fruit recommendation, and only 9 percent meet the vegetable recommendation set by the federal dietary guidelines.
The experts concluded that because fruits and vegetables are an essential part of people's diets and affect many different health outcomes, healthcare workers must continue their efforts to increase consumption of these foods.
Latetia V. Moore of the CDC's National Center for Chronic Disease Prevention and Health Promotion said in a statement that eating the appropriate amount of fruits and vegetables every day can help prevent obesity, stroke, heart disease, and even certain cancers.
Image Source: sunnytatra.com
Google's Gmail Postmaster Tools are expected to push spam rates even lower.
According to a recent report published by the U.S. security company Symantec, email spam rates are at their lowest level in more than a decade. Spam now accounts for just 49.7 percent of all email, the first time the figure has dropped below 50 percent since 2003.
Tech companies have put considerable effort over the past decade into curbing Internet attacks and spam. Those efforts seem to have paid off, as the new Symantec report shows that spam now makes up less than half of all email.
The study shows that only 49.7 percent of the messages in our inboxes are spam, and not all of them are as annoying as the well-known money-begging scams. These figures suggest that the recent measures Google has adopted, such as the introduction of its Gmail Postmaster Tools, have been effective.
The research also traces the evolution of spam over the past 12 years. Spam rates have remained stubbornly high in areas related to retail, non-traditional services and public administration, all of which registered a slight increase in spam between April and May.
According to Symantec's data, the spam rate in the retail sector rose from 52.1% in April to 53.1% in May. A similar increase was seen among services-related messages, which went from 52.6% in April to 53% in May. The public administration sector currently has a 52.3% spam rate, up almost a full percentage point from April's 51.4%.
Further analysis has also allowed Symantec to sketch a profile of the most prolific spam sources. According to the researchers, small companies with between 1 and 500 employees are the most likely to send spam, and they have been responsible for more than 53 percent of the spam messages sent over these past 12 years.
At the opposite pole lie larger companies, which send spam far less often, according to Symantec's observations: enterprises with 500 to 2,500 employees do not exceed the 50% spam threshold.
This percentage is expected to decrease further in the coming period thanks to the introduction of Google's Postmaster Tools, a service that allows certified email senders to better analyze their messages and eliminate possible errors and spam.
Image source: www.techspot.com
There may be more differences than we believed.
A report in Nature Communications argues that our hands are primitive compared to a chimpanzee's. Even though we consider ourselves superior in many ways to our closest living relative in the animal kingdom, it appears that in this matter the chimpanzees have the upper hand.
For years, scientists believed that the human hand, capable of highly precise and complex actions, represented one of the pinnacles of our evolution and had long since surpassed that of other primates. However, a research group led by Sergio Almécija, now at George Washington University, claims that this belief is at least partially wrong.
Humans were thought to have fully developed their hands only after they started using basic stone tools. But Almécija's team compared the proportions of the different parts of our hands and arms to those of chimpanzees and other apes, and the findings suggest that little has changed since before humans picked up their first tools.
Our hands closely resemble those of our hominoid ancestors, while the chimpanzee's hands have evolved considerably. They are significantly more specialized than ours, with longer fingers relative to their thumbs – a feature chimps probably developed after our own limbs had largely stopped changing.
Chimpanzees' hands serve as their tools, making it easier for them to reach for food and climb trees. Humans, on the other hand, went through major neurological changes that allowed them to create stone weapons and other equipment vital to their survival, while their arms and hands remained mostly unchanged.
Gorillas, too, have hands very similar to ours, indicating that land-based primates may have stopped evolving their limbs a long time ago (though gorillas do have one big opposable toe). Almécija believes his research shows that the common ancestor of humans and chimps was more human-like, and that the chimp evolved down a different path while we developed our brains.
On the other hand, Adrienne Zihlman from the University of California challenges the findings of Almécija and his team. She claims the research is based on the proportions of hand bones alone and that far more data is required to draw a sound conclusion about the common ancestor of the primates.
Nevertheless, the research does provide a new point of view on our evolution as a species.
Image Source: www.middaydaily.com
Risk of Alzheimer’s disease might be predicted by a new non-invasive test developed by researchers from the University of Texas.
While fairly affordable, the test has identified a specific variation in brain waves in people who suffer from amnestic mild cognitive impairment (aMCI), a condition consistent with a doubled risk of developing Alzheimer's compared with others in their age group.
The experiments detected a kind of delayed neural activity known to underlie severe impairment of cognitive performance. It was measured with a word-finding task, and it led the scientists to believe it could indicate the early onset of the neurocognitive decline that comes with Alzheimer's disease.
One of Alzheimer's hallmark symptoms is impaired episodic memory, which undermines the patient's ability to retain new memories – including recent events, conversations or upcoming meetings.
The new approach to diagnosing the disease uses electroencephalogram (EEG) technology, a non-invasive alternative that might prove more affordable for everyday people dealing with Alzheimer's. The technology essentially measures how a participant's brain waves react when they try to access long-term, or semantic, memory.
According to one of the lead researchers, John Hart, head of Medical Science at the Centre for Brain Health, this is a promising start in applying the technology to patients with mild cognitive impairment (MCI), who took noticeably longer on the semantic memory task and whose results were also less accurate than those of the healthy control group.
During the study, the researchers used EEG to monitor 16 individuals with MCI and 17 healthy controls while they were shown pairs of words that either led the subject to think of an object or were simply random pairings.
For example, 'summer' and 'frozen' would make participants think of 'ice cream', whereas 'summer' and 'monitor' would just be a random pair. All the participants had to do was press a button whenever a presented pair evoked a particular object.
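For readers curious how a trial list of this kind might be organized, here is a minimal sketch of such a word-pair task. The pairs, the simulated button presses and the scoring below are invented purely for illustration; they are not the stimuli or procedure used in the actual study.

```python
# Minimal illustrative sketch of a word-pair semantic task like the one
# described above. Everything here is hypothetical: the word pairs, the
# simulated responses and the timing are not taken from the study.
import random
import time

# Each trial: (word_1, word_2, evokes_object). True means the pair should
# bring a specific object to mind (e.g. "summer" + "frozen" -> "ice cream").
TRIALS = [
    ("summer", "frozen", True),     # intended to evoke "ice cream"
    ("summer", "monitor", False),   # random, unrelated pairing
    ("birthday", "candles", True),  # intended to evoke "cake"
    ("pencil", "ocean", False),
]

def run_session(trials):
    """Present each pair and record a (simulated) response plus its latency."""
    results = []
    for word_a, word_b, evokes_object in trials:
        start = time.perf_counter()
        # In a real experiment the participant would see the words on screen
        # and press a button only when the pair evokes an object. Here the
        # response is simulated so the sketch runs on its own.
        pressed = evokes_object if random.random() > 0.1 else not evokes_object
        latency = time.perf_counter() - start
        results.append({
            "pair": (word_a, word_b),
            "correct": pressed == evokes_object,
            "latency_s": latency,
        })
    return results

if __name__ == "__main__":
    for trial in run_session(TRIALS):
        print(trial)
```

In the real study, of course, the interesting signal was not the button press itself but the EEG activity recorded while participants retrieved the object memory.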
Previous EEG studies have focused on observing the mind 'at rest', but the new study monitored the brain precisely while it was engaged in retrieving object memories.
Senior author Hsueh-Sheng Chiang, a post-doctoral fellow at UT Southwestern Medical Centre, explained that certain cognitive deficits – such as semantic memory issues – are easier to identify with this sensitive technology, since EEG looks directly at neural activity.
Image Source: Daily Mail
Facebook introduces a new shopping section in order to stimulate e-commerce companies' engagement on the social network.
As of today, you may want to start making your shopping lists: Facebook is introducing a new "Buy" button, according to a recent report published by BuzzFeed. The news has been confirmed by the social network, which further stated that it plans to turn its pages into regular mini storefronts.
Facebook is no longer just a social network, judging by the wide palette of online services the company offers its users. The most recently announced change is the introduction of a new "Buy" button.
Many other shopping-related changes will later be rolled out by the social network's developers so that Facebook pages can become regular mini storefronts. Mark Zuckerberg, the founder of the social network, has confirmed the recent rumor and has even provided some insight into Facebook's new look.
According to Zuckerberg, the new "Buy" button will be available on both the desktop and mobile versions of the platform. The shopping feature is currently being tested by a small group of users and e-commerce platforms, the CEO stated. Once the storefronts prove reliable enough, the feature will be rolled out to all Facebook users, though release dates may vary from one country to another.
Based on the description provided by the social network, a new "Shopping" section will be added to the mobile version of the platform, where users can get a full view of the products they are interested in. On the desktop version, the section will appear as a distinct tab alongside About, Photos and Timeline.
While the shopping section may still change in the coming period, based on the feedback developers receive from their e-commerce clients, we can already get an impression of the page's general look: the picture layout of the shopping section is very similar to the structure developers have used so far for regular photos.
Users will also be able to choose whether to place their order directly on the Facebook page or go to the company's official website to complete the purchase. If they choose to stay on Facebook, the "Buy" button will guide them through the entire ordering process.
No other details have been provided about the kinds of businesses that will use the shopping section, nor have the names of the test participants been revealed. The change is meant to stimulate engagement among users, company representatives have concluded.
Similar endeavors have been carried out in the past as well: Facebook introduced peer-to-peer payments in Messenger after launching a business version of the app.
Image source: g.twimg.com
It's no secret that Facebook is trying to dip its hands into YouTube's revenue by becoming a worthy alternative for video. Over the last two years, the social network has been steadily releasing new features designed to get more users to watch videos directly on the platform instead of going to YouTube.
Along with the new features, Facebook’s improved News Feed algorithms have also appealed to advertisers, celebrities and media outlets by rewarding them for posting directly on its site. And this cocktail of strategies seems to be paying off, if we’re to believe Facebook’s data.
Since 2013, the number of videos posted per person has skyrocketed by 94 percent in the United States and 75 percent worldwide. The greatest increase, however, was recorded in the number of paid and unpaid videos showing up in the News Feed, which rose by 360 percent.
According to comScore's data, YouTube is still the No. 1 video site in the US, but Facebook is a close second, narrowing the gap with each passing year. Several changes and updates have made Facebook more similar to YouTube.
One of the greatest successes has been the auto-play tool, which turned the user's News Feed into a livelier environment. The feature, launched in the second half of 2013, is only available for videos uploaded directly to Facebook.
Some found the auto-play tool annoying, but most agreed that it was even more annoying to have to click play every time you wanted to watch a video again. Views have also increased, passing the milestone of 4 billion a day within the first six months of 2015.
Another trick that has helped Facebook move closer to YouTube is showing how many times a video was viewed. When you see a video was viewed 10 million times, you are more compelled to stop scrolling and watch it as well. This strategy has worked well for YouTube and many social video platforms have also adopted it.
At the same time, Facebook’s News Feed algorithm is trained to feed your habits – especially if you watch a lot of videos on its platform. It’s a known fact that Facebook can track your viewing practices, like how long you watch a video, or how often you expand the video player.
Apart from these successful new features, Facebook has one ace up its sleeve that gives it an immense advantage over YouTube: the ease with which videos can be shared poses a real threat to YouTube's dominance. Facebook also has the advantage of being perceived as a social network, something YouTube still struggles with; most users see it as a giant video search engine.
Image Source: iProspect
The Food and Drug Administration brings good news for people who prefer to remain ignorant about their calorie intake when they go out to eat, while those who hoped to know exactly what goes on behind diners' closed kitchen doors will have to wait a while longer.
The reason you'll have to wait a while longer before you can be fully confident in what you order is that the FDA came under heavy pressure from hundreds of thousands of large and small businesses across the country. Cracking under that pressure, the agency agreed to push the deadline back to 2016.
The controversial FDA rule requires all of America's grocery stores, restaurants, and movie theaters to make public the calorie content of all the food products they sell.
At first, the rule was to be enforced starting this December, but that deadline apparently did not give the industry enough time either to make the necessary changes to the food it offers or to ensure its calorie listings would be accurate.
Michael Taylor, the FDA deputy commissioner for foods and veterinary medicine, explained the delay by saying the agency needs more time to offer help and guidance so that all businesses can comply with the new requirements.
Both Congress and major industry players have insisted to the FDA that there isn't enough time and that the rules are not clear enough.
There was general confusion as to which food items needed calorie counts on their menus, so a new December 2016 deadline was set. The FDA also said it will provide more comprehensive guidelines with simpler instructions so that all businesses can get it right.
Many restaurants and fast food companies have already played by the rules and have their calorie counts ready to go, but there are still plenty of industry brands doing all they can to avoid any revelations. (Yes, I'm looking at you, Domino's Pizza – stop insisting that your clients normally order by phone and get those nasty calories down in your menu!)
Image Source: Angry Fitness Trainer
This is not a car, it’s a beast.
It’s been some time since we got excited about a car, but we’re really excited about this: the roaring Mercedes-Benz CLS 63 AMG S. It will make you smile. It will make you frown. It might even make you a bit… uneasy.
Now, if you happen to be driving down the road in one of these monsters, you might want to make sure you are pretty relaxed beforehand, since the looks you get from the other drivers around you might not be exactly what you'd hoped for.
You're going to get a lot of smirks from people thinking you're some sort of unhealthily rich guy, which you probably are. People driving Toyotas will undoubtedly show disgust on their faces, since your car may well be the equivalent of a leaking nuclear power plant in terms of eco-friendliness – it probably is. Two students chatting about Nietzsche on a bench may literally cover their ears for fear of going deaf, since your engine is so loud it could play lead guitar in a heavy metal band.
And really, you can’t possibly blame these reactions. No, you can’t. Then again, there is still something undeniably captivating about shooting down the street in one of these mean white stallions. Perhaps the people on the street ought to be thinking what their inner child would do with one.
The three-pointed star sits gloriously up front. Behind it lies a 5.5-liter V8 with two turbochargers which, really, is like Superman's X-ray vision – it could do just fine without another superpower. But this car doesn't want to have just one trait, it needs to have it all. And the engineers have made sure of that: power peaks at 5,500 rpm, and the car goes from 0 to 60 in 4.1 seconds. It has two top speeds: 250 km/h or, if you ask the folks at the factory nicely, they will remove the limiter and you can go up to 300.
But here come the bad parts. It's a luxury car, and because of this, the makers figured you probably don't care what you spend on fuel. It gobbles up about 10 liters per 100 kilometers, the company says, but recent tests put it at around 15.5. If you're feeling lucky, you'd better strike a deal with an Arab oil company yourself; you're going to need it.
That is not all. There is a problem with its beast-like behavior. When you accelerate a bit, it wobbles like a maniac. And if it’s wet on the road, you’d best have a clean soul, for you may need it.
We're going to leave the price to you, since we don't want to pose a health risk. What can be said is that it should come with driving lessons, or a personal driver, because you could afford to pay one for many, many years with the money you would spend on this car.
Image source: mnmcdn.com
Bend the laws of physics? More like rewrite them.
Scientists all over the world are reconsidering what they previously thought about the laws that govern our physical universe. Laws of physics beware: there's a black hole far too big for its galaxy – and it's still growing. Astrophysicists are, quite rightfully, scratching their heads and shrugging their shoulders.
Black holes are already known to be enormous regions of space with a gravitational pull so strong that not even light can escape – which, if you hadn't guessed, is why they're black. Supermassive black holes are the ones that sit at the centers of galaxies, galaxies like our own Milky Way; yet even the closest known supermassive black hole is relatively small compared to the plus-sized monster found in the galaxy CID-947.
The biggest way this discovery contradicts current theories about black holes is that a black hole and its galaxy don't necessarily grow at the same rate. In this particular case, the black hole grew much, much faster than the stars that make up its rather skinny galaxy.
The galaxy, and its corresponding black hole, formed in the very early years of the universe – the Big Bang plus about two billion years. That may seem like a lot, but on the universe's timescale it's like a two-week-old baby. An international study group made this surprise discovery while looking for average black holes as part of a project mapping supermassive black holes and how they grow and evolve over long stretches of cosmic time.
Among the biggest black holes ever discovered, this one measures nearly 7 billion times the mass of our Sun. That's especially fascinating since the galaxy around it is relatively normal, weighing just about as much as a typical galaxy.
Another finding that contradicts what was previously known is that stars are still forming within the distant galaxy. The researchers therefore maintain that it will continue to grow, despite its strangeness and the size of its black hole.
This has prompted those behind the discovery to speculate that it may be a predecessor of more extreme and massive systems such as NGC 1277, in the constellation Perseus, situated some 220 million light-years away from our own galaxy.
The biggest thanks for this discovery, the scientists said, go to the W.M. Keck Observatory in Hawaii and to the Chandra COSMOS survey, both of which aided them greatly. They especially mentioned MOSFIRE, Keck's new infrared spectrometer.
Image source: nature.com
IBM was the first to introduce the 7nm chip.
As the famous computer company unveils a new milestone, here are the most important things you need to know about IBM's 7nm computer chips. The enterprise is now in the media spotlight because it was the first to produce a tiny yet functional and powerful silicon chip at this scale.
The manufacturing of the chip was made possible through a recent collaboration between IBM, Samsung, GlobalFoundries and SUNY. The four partners have joined forces for one of the most daring initiatives in the technology field.
Thanks to IBM's new achievement, the standard feature size of silicon chips could be significantly reduced: where the industry was preparing to move to 10nm, chips could soon shrink to just 7nm. The good news is that the test chips are functional and could eventually be used to produce highly advanced devices.
In spite of the chips' positive test results, IBM has disclosed that the 7nm chip will undergo a further series of experiments and tests over the next two years. Scientists have praised IBM's achievement because it is the first time a working chip this small has been demonstrated.
In addition, the chip is the first to use silicon-germanium in its transistors and the first to incorporate extreme ultraviolet (EUV) lithography in its production process. The silicon-germanium material offers better transistor performance than the plain silicon used until now.
The incredibly small transistors could only be created with the help of EUV lithography, which is why the scientists opted for this method over others. IBM's developers also used quadruple patterning to properly arrange the transistors on the chip.
The company has stated that the feature size has been roughly halved compared with today's chips, which would make commercial models considerably denser. The current industry standard is 14nm, and manufacturers are presently working toward 10nm.
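As a rough back-of-the-envelope illustration (our own simplification, assuming ideal scaling that real manufacturing processes never fully achieve), halving the linear feature size roughly quadruples how many transistors fit in the same area:

```python
# Rough, idealized illustration of why halving a chip's feature size matters.
# Real processes never scale perfectly, so treat these figures as an upper bound.

def relative_density(old_nm: float, new_nm: float) -> float:
    """How many times more transistors fit in the same area, assuming ideal
    scaling where density grows with the inverse square of the feature size."""
    return (old_nm / new_nm) ** 2

print(relative_density(14, 10))  # 14nm -> 10nm: about 2x the density
print(relative_density(14, 7))   # 14nm -> 7nm: about 4x the density
```

In practice the gains are smaller, but the square relationship explains why a jump straight from 14nm to 7nm is such a big deal.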
A 7nm commercial model would definitely represent the beginning of a new era for silicon chips and for the products that could be built with them.
Image source: arstechnica.net