Stop oversimplifying everything!

We all like things simple

Once upon a time, our world was simple. There was a thesis — “The Anatomy of a Large-Scale Hypertextual Web Search Engine” by Sergey Brin and Larry Page — that told us how Google worked. And while Google evolved rapidly from the concepts in that document, it still told us what we needed to know to rank highly in search.

As a community, we abused it — and many made large sums of money simply by buying links to their site. How could you expect any other result? Offer people a way to spend $2 and make $10, and guess what? Lots of people are going to sign up for that program.

Spammers are Relentless

But our friends at Google knew that providing the best search results would increase their market share and revenue, so they made changes continually to improve search quality and protect against attacks by spammers. A big part of what made this effort successful was obscuring the details of their ranking algorithm.

When reading the PageRank thesis was all you needed to do to learn how to formulate your SEO strategy, the world was simple. But Google has since been issued hundreds of patents, most of which have probably not been implemented and never will be. There may even be trade secret concepts for ranking factors for which patent applications have never been filed.

Hundreds of Patents Make for a Confusing Landscape

Yet, as search marketers, we still want to make things very simple. Let’s optimize our site for this one characteristic and we’ll get rich! In today’s world, this is no longer realistic. There is so much money to be had in search that any single factor has been thoroughly tested by many people. If there were one single factor that could be exploited for guaranteed SEO success, you would already have seen someone go public with it.

‘Lots of different signals’ contribute to rankings

Despite the fact that there is no silver bullet for obtaining high rankings, SEO professionals often look for quick fixes and easy solutions when a site’s rankings take a hit. In a recent Webmaster Central Office Hours Hangout, a participant asked Google Webmaster Trends Analyst John Mueller about improving his site content to reverse a drop in traffic that he believed to be the result of the Panda update from May 2014.

The webmaster told Mueller that he and his team are going through the site category by category to improve the content; he wanted to know if rankings will improve category by category as well, or if there is a blanket score applied to the whole site.

Here is what Mueller said in response (emphases mine):

“For the most part, we’ve moved more and more towards understanding sections of the site better and understanding what the quality of those sections is. So if you’re … going through your site step by step, then I would expect to see … a gradual change in the way that we view your site. But, I also assume that if … you’ve had a low quality site since 2014, that’s a long time to … maintain a low quality site, and that’s something where I suspect there are lots of different signals that are … telling us that this is probably not such a great site.”

(Note: Hat tip to Glenn Gabe for surfacing this.)

I want to draw your attention to the key phrase in the above comment: “lots of different signals.” Doesn’t it make you wonder what those signals are?

While it’s important not to over-analyze every statement by Googlers, this certainly does sound like the related signals would involve some form of cumulative user engagement metrics. However, if it were as simple as improving user engagement, it likely would not take a long time for someone impacted by a Panda penalty to recover — as soon as users started reacting to the site better, the issue would presumably fix itself quickly.

What about CTR?

Larry Kim is passionate about the possibility that Google directly uses CTR as an SEO ranking factor. By the way, do read that article. It’s a great read, as it gives you tons of tips on how to improve your CTR — which is very clearly a good thing regardless of SEO ranking impact.

CTR vs Ranking Position Chart

That said, I don’t think Google’s algorithm is as simple as measuring CTR on a search result and moving higher-CTR items up in the SERPs. For one thing, it would be far too easy a signal to game, and the industries well-known for aggressive SEO testing would have pegged this as a ranking factor and made millions of dollars from it by now. Second, a high CTR says nothing about the quality of the page users will land on; it speaks to your approach to title and meta description writing, and to your branding.

We also have the statements by Paul Haahr, a ranking engineer at Google, on how Google works, from the presentation he gave at SMX West in March 2016. In it, he discusses how Google uses a variety of user engagement metrics. The upshot is that these metrics are NOT used as direct ranking factors; instead, they are used in periodic quality control checks of the ranking factors Google does use.

How Google Uses CTR as a Ranking Factor

Here is a summary of what his statements imply:

  1. CTR, and signals like it, are NOT used as direct ranking factors.
  2. Signals like content quality and links, and algorithms like Panda, Penguin, and probably hundreds of others are what they use instead (the “Core Signal Set”).
  3. Google runs a number of quality control tests on search quality. These include CTR and other direct measurements of user engagement.
  4. Based on the results of these tests, Google will adjust the Core Signal Set to improve test results.

This process allows Google to run its quality control tests in a controlled environment that is not easily subject to gaming, which makes the algorithm far harder for black-hat SEOs to manipulate.
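To make that distinction concrete, here is a minimal toy sketch in Python of the kind of feedback loop described above. This is not Google’s actual code; every name and number here (score, tune_weights, the signal values) is invented for illustration. The key point is that engagement data only feeds the offline tuning step, never the live scoring function.

```python
def score(page: dict, weights: dict) -> float:
    """Live ranking score uses only the 'Core Signal Set'; engagement is absent."""
    return sum(weights[name] * page.get(name, 0.0) for name in weights)

def tune_weights(weights: dict, ctr_lift_per_signal: dict, lr: float = 0.1) -> dict:
    """Periodic quality-control step: nudge each core signal's weight based on
    how that signal correlated with measured engagement in controlled tests."""
    return {name: w + lr * ctr_lift_per_signal.get(name, 0.0)
            for name, w in weights.items()}

# Example: content quality helped CTR in the tests; a link signal slightly hurt it.
weights = {"content_quality": 1.0, "links": 1.0}
weights = tune_weights(weights, {"content_quality": 0.3, "links": -0.1})
print(score({"content_quality": 0.8, "links": 0.5}, weights))  # ~1.319
```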

So is Larry Kim right? Or Paul Haahr? I don’t know.

Back to John Mueller’s comments for a moment

Looking back on the John Mueller statement I shared above, it strongly implies that there is some cumulative impact over time of generating “lots of different signals that are telling us that this is probably not such a great site.”

In other words, I’m guessing that if your site generates a lot of negative signals for a long time, it’s harder to recover, as you need to generate new positive signals for a sustained period of time to make up for the history that you’ve accumulated. Mueller also makes it seem like a gradated scale of some sort, where turning a site around will be “a long-term project where you’ll probably see gradual changes over time.”

However, let’s consider for a moment that the signal we are talking about might be links. Shortly after the aforementioned Office Hours Hangout, on May 11, John Mueller also tweeted out that you can get an unnatural link from a good site and a natural link from a spammy site. Of course, when you think about it, this makes complete sense.

How does this relate to the Office Hours Hangout discussion? I don’t know that it does (well, directly, that is). However, it’s entirely possible that the signals John Mueller speaks about in Office Hours are links on the web. In which case, going through and disavowing your unnatural links would likely dramatically speed up the process of recovery. But is that the case? Then why wouldn’t he have just said that? I don’t know.

But we have this seemingly genuine comment from Mueller on what to expect in terms of recovery, with no easily determined explanation of what signals could be driving it.

We all try to oversimplify how the Google algorithm works

As an industry, we grew up in a world where we could go read one paper, the original PageRank thesis by Sergey Brin and Larry Page, and kind of get the Google algorithm. While the initial launch of Google had already deviated significantly from this paper, we knew that links were a big thing.

The PageRank Paper Made SEO Easy

This made it easy for us to be successful in Google, so much so that you could take a really crappy site and get it to rank high with little effort. Just get tons of links (in the early days, you could simply buy them), and you were all set. But in today’s world, while links still matter a great deal, there are many other factors in play. Google has a vested interest in keeping the algorithms they use vague and unclear, as this is a primary way to fight against spam.

As an industry, we need to change how we think about Google. Yet we seem to remain desperate to make the algorithms simple. “Oh, it’s this one factor that really drives things,” we want to say, but that world is gone forever. This is not a PageRank situation, where we’ll be given a single patent or paper that lays it all out, know that it’s the fundamental basis of Google’s algorithm, and then know quite simply what to do.

The second-largest market cap company on planet Earth has spent nearly two decades improving its ranking algorithm to ensure high-quality search results — and maintaining the algorithm’s integrity requires, in part, that it be too complex for spammers to easily game. That means that there aren’t going to be one or two dominating ranking factors anymore.

This is why I keep encouraging marketers to understand Google’s objectives — and to learn to thrive in an environment where the search giant keeps getting closer and closer to meeting those objectives.

We’re also approaching a highly volatile market situation, with the rise of voice search, new devices like the Amazon Echo and Google Home coming to market, and the impending rise of personal assistants. This is a disruptive market event. Google’s position as the number one player in traditional search may be secure, but search as we know it may no longer be that important an activity. People are going to shift to using voice commands and a centralized personal assistant, and traditional search will be a minor feature in that world.

What this means is that Google needs its results to be as high-quality as it can possibly make them, while fighting off spammers at the same time. The result is a dynamic, ever-changing algorithm that improves overall search quality as much as possible, so Google can maintain its stranglehold on that market share and, if at all possible, establish a lead in the world of voice search and personal assistants.

What does it mean for us?

The simple days of gaming the algorithm are gone. Instead, we have to work on a few core agenda items:

  1. Make our content and site experience as outstanding as we possibly can.
  2. Get ready for the world of voice search and personal assistants.
  3. Plug into new technologies and channel opportunities as they become available.
  4. Promote our products and services in a highly effective manner.

In short, make sure that your products and services are in high demand. The best defense in a rapidly changing marketplace is to make sure that consumers want to buy from you. That way, if some future platform does not provide access to you, your prospective customers will let them know.

Notice, though, how this recipe has nothing to do with the algorithms of Google (or any other platform provider). Our world just isn’t that simple anymore.




Want more links? Be more likable.

One of the things I like best about link building is that it really is as much art as it is science, if not more so. I do roll my eyes when people talk about “relationship building” — and I roll them when I write about it, too — but in many ways, it really is all about that. If you can’t make webmasters like you and what you’re proposing, you’re not going to get those links.

Following are some tips for being a more likable link builder:

Outreach

Don’t be a stalker. If there’s no contact information on the site, maybe that means they don’t want to be contacted. There are many ways of finding the contact info for a site, and I have certainly used them before. After realizing that it’s mostly just creepy, I’ve stopped.

These days, if we don’t find contact info on a site, we just assume we should not bother them. A lack of contact information on a site is the digital equivalent of a “Do Not Disturb” sign on a hotel door or a gated fence around a house. These people do not want to be bothered.


If someone says, “go away and leave me alone,” accept it and move on. They do really mean it. I know I sure as heck mean it when I tell someone to stop emailing me. I occasionally get what turns out to be the fifth or sixth email from the same person — after I’ve asked them not to contact me again.

“Are you sure you won’t change your mind? Don’t you want to reconsider?” No, I really and truly don’t.

If you contact someone, and they ask you not to do so again, respect it. Add them to a list, to a “do not contact” database, whatever you need. While I do ask why someone turns down an offer we make sometimes, I never ask someone why they don’t want to be contacted again. I just accept it.

Don’t harass people endlessly and on every platform possible. This goes with the above point, but some people do take this too far. I’ve had people email me at my personal account, my business account, send me messages through Facebook, call the office and tweet to me, and that is just unnecessary.

This can border on serious harassment at times. What would you think if you told a friend you had other plans, checked in on Swarm, and then, lo and behold, she saw your check-in and showed up at the restaurant where you were having dinner with other friends?

Discovery

If you see something that doesn’t look quite right to you, avoid it. I have really learned to rely on my gut instincts over the course of my career. If a site looks good but I feel like I’m missing something, I probably am. The metrics may be great and the relevancy may be there, but something tells me I need to spend more time making sure this really is a good site.

Recently, a client and I agreed that a site we wanted to work with was good. The webmaster responded, and something in his email just seemed a bit off, so we dug in deeper and found some articles that made me rethink the whole thing due to their outbound links and poorly written content. The main pages that I’d seen had been great, but there were some older articles that looked like they’d once been a part of another site, written by another person, and they’d been hastily thrown in.


Don’t make a judgment call based solely on appearances. This is true for an amazingly gorgeous site and for one of those sites coded in ColdFusion that looks like it hasn’t been updated since 2001. Just because a site is pretty doesn’t mean it’s a good fit for you; it may not help you gain rankings or traffic. And just because it’s hideous, that doesn’t mean it won’t rank well, help you rank better, and send you relevant traffic.

Negotiation

Stop trying to make something work that just isn’t going to happen. If a webmaster says no, accept it and don’t keep trying to find 10 different ways to make it work. Sometimes we do push back a little bit if we feel like the webmaster doesn’t fully understand what we’re asking for, but usually, if a webmaster says no, it’s because he or she doesn’t think it’s a good fit. You have to respect that, even if you do mutter to yourself all day afterward about it. You want a link. That can easily cloud your judgment at times.


Accept rejection and move on. I think this is much easier in business than in real life, of course. I also like to have reasons laid out for me, and that isn’t always possible. When a webmaster rejects my proposal, I do become offended, as much as I hate to admit it. However, I know there are other great sites out there, and I’d rather spend time finding them and negotiating a link than licking my wounds and badgering someone to tell me what I did wrong.

When I first started link building, I would get very upset if someone didn’t like what I was proposing. These days, I’m used to it and feel like it’s usually my fault in some way. Maybe I didn’t do enough due diligence. Maybe I wasn’t clear enough in my communications.

Closing thoughts

I don’t care what additional goods you’re offering someone other than your content, whether it’s free products or tons of cash. The majority of links you get are going to come from someone liking you and/or what you represent. Remember that if you take the view that link building is mostly relationship building, you have to approach it just as you would a real-life situation.

Of course, link building is more than just relationship building — but you still can’t discount the importance of building a relationship in order to get a link. It is a crucial part of an intricate and complex process.




Google is testing 11 variations of black links in search results

Google appears to be testing a new look for its search results pages that would change the color of links from the traditional blue to black.

First reports of the black SERP tests started to appear at the end of April, and Google already ran a similar test almost exactly one year ago. This time around, however, webmasters have noticed several different variations of the test, and we decided to try to find as many of them as we could.

We’ve found 11 variations thus far, and we are pretty confident there are more.

1. Large black URLs

These results feature the listing URL above the page title.


2. Small black URLs

This display is similar to the one above, but the URL text is noticeably smaller.


3. Thin blue bars

This display features a blue bar to the left of the search listing’s page title.


4. Thick blue bars

Similar to the display above, this version features a thicker blue bar to the left of the page title in search results.


5. Black horizontal dots

To the right of the URL below, you can see three small dots, aligned horizontally. Presumably, you can click this for more options or information.


6. Black vertical dots 

This display is similar to the one above, but the dots are aligned vertically.


7. Blue titles

In the screen shot below, you can see that the page title link is displayed in the traditional blue color, while all other links (sitelinks and URL) are black.


8. Blue URLs

In this version, the URL is displayed in the traditional blue color, while the rest of the links (page title and sitelinks) appear black.


9. Colored dots

The display below features four colored dots to the right of the URL. The dots appear to be the same colors featured in Google’s logo.


10. Dashed blue underline

This display shows the listing’s page title underlined with a blue dashed line. (h/t Aaron Dicks)


11. Globe icons

This test shows a blue globe icon in search results, above the page title and next to site breadcrumbs. (h/t Kenichi Suzuki)


What are you seeing?

Google itself claims to have run 9,800 live traffic experiments last year, and it seems they’re continuing that tradition this year in testing black URLs in search results. So far, we’ve identified at least 11 different variations of one test — and there may be more out there. Have you seen any versions we missed? If so, let us know on social media!




Voice, mobile & apps: Get the latest on Google’s search developments at SMX

Doubling down on voice

Between the many Google Home and Google Assistant announcements, it’s clear that marketers need to shift content creation strategies to focus not only on natural conversational language, but also on the context of the query — which device it comes from, and the time and location of the search.

Joe Youngblood noted in his recent column:

“Voice Search and Google assistant are exciting technologies that promise to make searching at home a more immersive and useful experience; however, they come with challenges we as marketers must be ready to face and find solutions to.”

At Search Engine Land’s SMX Advanced conference, Eric Enge will address these challenges and suggest opportunities for marketers in his “Attaining Position Zero: Featured Snippets In The Google Home Google Assistant World” session.

All the apps

Google also announced Instant Apps, not to be confused with PWAs (progressive web apps), native apps or AMP pages. As described in the article, Instant Apps are like “AMP for apps.” As Google continues its headlong push into what appears to be not just “mobile-first” but mobile-only territory, understanding the differences between these platforms and technologies and their uses remains critical for marketers.

Our experts in Seattle will define these differences and discuss the ways that marketers can take advantage of them during these sessions:

Still wondering what to make of all the announcements? Come to the keynote with Google’s Gary Illyes. It’s an AMA (Ask Me Anything), so have your questions ready!

See you in Seattle!

Tickets are nearly sold out, so to secure your spot among the top experts and most advanced attendees, register now.


Google will automatically convert display ads to AMP, test AMP landing pages for Search

At Tuesday’s Google Marketing Next event, Google announced more integrations for AMP for Display and Search advertisers.

For search advertisers, Google is launching a beta to serve AMP-enabled landing pages on mobile from Search ads. When users click a search ad on mobile, the ad will direct them to an AMP landing page, which can load much faster than a standard landing page. When creating ads in AdWords, advertisers will be able to designate an AMP-enabled landing page at the ad level. Google says early testing has shown improved user behavior and conversion outcomes.

On the Google Display Network, Google will automatically convert display ads to AMP. Google has found AMP ads can load up to five seconds faster than standard display ad builds.

Google is currently working on unifying unique users who visit both AMP and non-AMP pages in Analytics. At I/O last week, Google also made several announcements about its push to make AMP the standard for the mobile web.


James Bond Star Roger Moore Passes Away At 89

After a short battle with cancer, James Bond star Roger Moore has passed away at the age of 89.

While the character of James Bond is more than likely the first role that pops to mind when you think of Roger Moore, the versatile actor also had a long television career. He was the third actor to take on the role of 007, after Sean Connery and George Lazenby, and played it in seven films across the ’70s and ’80s. Before that, however, he starred in several TV shows, including Ivanhoe, Maverick and The Saint. It was his turn as spy Simon Templar in The Saint that got him noticed as a possible replacement for Connery after Lazenby wasn’t able to win over the fans. Moore played a different sort of Bond than Connery did, but still brought the suave sophistication that fans of the super spy liked. He holds the record for the longest stint playing Bond, owning the role for 12 years from 1973 to 1985, and was the oldest actor in the part when he finally retired from it at 58.

Moore’s passing was announced by his children on Twitter. They posted a short note that stated Moore had died after a “short but brave battle with cancer.” They went on to say that “We are all devastated.” They also noted Moore’s work, telling the world that “We know our own love and admiration will be magnified many times over, across the world, by people who knew him for his films, his television shows and his passionate work for UNICEF which he considered to be his greatest achievement.”

Moore was appointed a Commander of the Order of the British Empire (CBE) in 1999 and made a Knight Commander of the Order of the British Empire (KBE) in 2003.

 

 

Master The Season’s White Evening Gown Just Like A Celebrity

On the season’s red carpets, stark ivory gowns have been making a splash everywhere. Whether spotted on celebrities or sported by the fashion set, the white evening gown is clearly enjoying a moment. If you’re wondering how to master the look this season, look no further than these gorgeous and chic stars.

Jenna Dewan-Tatum

The 36-year-old dancer and actress looks effortlessly beautiful in this belted all-white frock with its cold shoulders and three-quarter sleeves. Meanwhile, the oversized buttons bring a hint of visual interest to her ensemble. For a finishing touch, Dewan-Tatum’s pointed toe nude pumps prove that neutral accents can be truly classy.

Emma Watson

Not only is the formal white gown stealing the spotlight, it is often spotted paired with an off-the-shoulder neckline. Emma Watson is radiant in this floor-length number, which is detailed with a waist-cinching cut and a thigh-high slit up the leg. For a contrasting effect, the 27-year-old star accessorizes with a crimson-coloured clutch and strappy black heels.

Katherine Heigl

With its elegant long hem and ruched accents, this dress makes Katherine Heigl look like a true lady. The Unforgettable star is an unforgettable sight in this frock, which emphasizes her feminine curves. Meanwhile, shoulder-dusting earrings and a metallic box clutch offer glitz and glamour.

Photos: Instar Images 

Best Looks From The 2017 Cannes Film Festival

With the Cannes Film Festival currently taking place, several stunning celebrities have descended upon the red carpets in Cannes, France. The annual festival in the French city has revealed everything from metallic gowns to ball gowns suited for a fairytale princess. From Aishwarya Rai to Jourdan Dunn, have a closer look at these celebrity sartorial statements.

Aishwarya Rai

The Bollywood bombshell was a sensational sight in this red strapless gown, which featured a voluminous hem adorned with tiers. Meanwhile, the sweetheart neckline bodice of her frock featured swirls of red embellishment. Not only did Rai manage to match the red carpet in her rich crimson gown, she also chose a colour which complemented her olive skin.

Nicole Kidman

As the star of the comedy How to Talk to Girls at Parties, Nicole Kidman made an appearance at Cannes in a stunning black and white Calvin Klein look. The strappy dress was accented with a full tulle skirt and a form-fitting bodice that enhanced her curves. She accessorized her ensemble with a pair of simple strappy black heels for a finishing touch.

Jourdan Dunn

The 26-year-old English model was feminine and dainty in a sheer blush-coloured frock by Elie Saab. With its cross-strap bodice and delicate pastel embroidery, Dunn’s red carpet look was delicate yet also detailed. The long, pleated hem brought a modest effect to the overall design, while the hint of décolletage completed her style.

Eva Longoria

Staying true to the recurring theme of sheer fabrics and dainty embellishment at Cannes, the gorgeous Eva Longoria stepped out in Marchesa. With its illusion neckline, flowing sheer sleeves and brilliant soft gold hue, the long dress flattered her curves and petite frame.

Photos: Instar Images 

SSDs Vulnerable to Deliberate, Low-Level Data Corruption Attacks


Over the last decade, solid state drives (SSDs) have moved from extremely rare (and expensive) alternatives to hard drives to being the storage option of choice for enthusiasts and mobile users alike. An SSD is one of the best ways to improve the performance of an older system with a traditional hard drive, and costs have fallen below 50 cents per GB. But a new paper from Carnegie Mellon University, Seagate, and ETH Zurich has shown that the way data is programmed into MLC SSDs makes them vulnerable to data corruption attacks, meaningfully reducing the drive’s lifespan in the process. That’s significant, because MLC drives constitute the vast majority of SSDs in-market.

Before we dive into the findings (PDF), let’s take a moment to define some terms. The first SSDs that were developed stored one bit of data per cell and are called single-level cell (SLC) drives. These devices are programmed in a single stage and are not vulnerable to the attack methods we’ll be discussing.

The next type of NAND stores two bits of information per cell and is referred to as multi-level cell, or MLC. The “level” refers to the number of voltage states within each cell. MLC devices used to use single-stage programming, but switched to a two-stage programming method below the 40nm process node (the overwhelming majority of 2D planar NAND drives are manufactured below 40nm these days).

Finally, there are triple-level cell SSDs, which store three bits of data in each cell. While this paper does not discuss whether TLC drives are vulnerable to these attack strategies, everything I could find on the TLC programming process suggests that they are, since TLC NAND uses a three-stage programming cycle.
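As a rough rule of thumb, a cell that stores n bits must reliably distinguish 2^n threshold-voltage states, which is why each additional bit per cell makes the programming sequence more delicate. A quick Python illustration:

```python
# Each extra bit per cell doubles the number of voltage states the hardware
# must tell apart, which is why programming gets progressively trickier.
for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3)]:
    print(f"{name}: {bits} bit(s) per cell -> {2 ** bits} voltage states")
# SLC: 1 bit(s) per cell -> 2 voltage states
# MLC: 2 bit(s) per cell -> 4 voltage states
# TLC: 3 bit(s) per cell -> 8 voltage states
```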

NAND programming voltages for SLC, MLC, and TLC.

One bit of good news, however, is that the attack we’re going to discuss does not work against current 3D NAND from Samsung and other manufacturers, yet. Current 3D NAND is single-shot programmed and built on older process nodes (40nm, in Samsung’s case). The study authors expect 3D NAND to return to two-stage programming as process node technology moves below 40nm once again in the future. Modern 2D planar NAND is built at a variety of nodes, defined somewhat differently by each manufacturer. One can assume 20nm as an overall approximation.

How the attack works

The reason that MLC NAND is programmed in a two-stage process below the 40nm process node is to reduce the chance that the voltage changes required to write data into one block of NAND will propagate into adjacent wordlines within the same block. This is known as cell-to-cell program interference and occurs because of parasitic capacitance coupling within the memory block. These effects are among the main reasons why 3D NAND became necessary — the smaller the process node the NAND is built on, the more difficult it is to prevent writes to one cell from corrupting data in other cells.

MLC NAND contains two bits of information, or pages — the least significant bit and the most significant bit (LSB and MSB). In the two-step process, the cell is initially set to a temporary state in which its threshold voltage is roughly half of the final value. This temporary LSB voltage is then read into a buffer within the flash chip, then programmed again, moving the threshold voltage to the final state expected for a fully programmed NAND cell. The SSD controller interleaves programming steps of any given cell with the programming steps of adjacent cells to minimize the chance for data corruption.

Figure 5

Here’s the problem. The data loaded into the LSB buffer for final MSB programming is loaded directly from the flash cell being programmed, not from the flash controller. This reduces latency and improves performance, since routing the work through the SSD controller would require transferring the data. But it also means that if the data loaded from the LSB buffer contains errors, those errors by definition can’t be corrected by the SSD controller — it never “sees” the data. The authors report that “the final cell voltage can be incorrectly set during MSB programming, permanently corrupting the LSB data” (emphasis original).
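A highly simplified model may make the exposure easier to see. This is my own sketch of the behavior the paper describes, not vendor firmware, and every structure in it is invented; the crucial detail is that the MSB pass consumes the LSB read back from the cell itself, so a bit flipped at that point is baked into the final voltage with no chance for the controller’s ECC to catch it.

```python
# Conceptual model of two-step MLC programming (not real firmware).
def program_lsb(cell: dict, lsb: int) -> None:
    cell["lsb"] = lsb              # step 1: temporary threshold voltage is set

def program_msb(cell: dict, msb: int, disturbed: bool = False) -> None:
    buffered_lsb = cell["lsb"]     # read back from the flash cell, NOT the controller
    if disturbed:
        buffered_lsb ^= 1          # interference or read disturb flips the buffered bit
    cell["state"] = (msb, buffered_lsb)   # step 2: final voltage, no ECC check here

cell = {}
program_lsb(cell, 1)
program_msb(cell, 0, disturbed=True)
print(cell["state"])  # (0, 0): the LSB page is now permanently wrong
```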

Rowhammer’s flash-based cousin

We’ve discussed Rowhammer before — the exploit that corrupts data in RAM by reading and writing to specific parts of DRAM in order to flip bits in target areas. The two-step programming process makes NAND vulnerable to a similar type of attack. Would-be attackers can either exploit cell-to-cell program interference, which uses parasitic capacitance coupling to introduce errors in adjacent cells, or use a technique known as read disturb, which repeatedly reads the same set of cells. This can create a weak programming effect capable of flipping bits in cells that aren’t even being read. Call it Cellhammer, if you like, but the results are the same — corrupted data and damaged SSDs.
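Conceptually, read disturb behaves like accumulating stress on neighboring wordlines. The toy model below (every number is arbitrary and nothing is drawn from the paper’s measurements) shows the shape of the problem: data ends up corrupted in a page that was never written.

```python
# Toy model of read disturb: repeatedly reading one page stresses its
# neighbours until a bit in an adjacent, unread page flips.
DISTURB_THRESHOLD = 100_000  # arbitrary illustrative limit

def read_page(block: dict, page: int) -> None:
    for neighbour in (page - 1, page + 1):
        if neighbour in block:
            block[neighbour]["stress"] += 1
            if block[neighbour]["stress"] > DISTURB_THRESHOLD:
                block[neighbour]["bit_error"] = True  # corrupted without being written

block = {0: {"stress": 0, "bit_error": False}, 1: {"stress": 0, "bit_error": False}}
for _ in range(DISTURB_THRESHOLD + 1):
    read_page(block, 0)
print(block[1]["bit_error"])  # True
```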

Figure 3

These types of attacks can meaningfully reduce SSD lifespan by introducing additional errors, as shown in Figure 3, above. According to the research team, none of the various management techniques baked into SSDs are currently sufficient to prevent these attacks, though there are several ways that these attacks could be mitigated and one option that would prevent them altogether.

Table 2

The most direct way to stop Cellhammer is to buffer LSB data directly in the SSD controller. This would prevent the attack from functioning by allowing the SSD controller to correct errors in the LSB data, rather than trusting that this data is accurate and contains no errors.
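Continuing the sketch from above, the fix looks roughly like this: the controller keeps its own ECC-protected copy of the LSB page and supplies that copy for the second programming pass, rather than trusting the value read back from the cell. Again, this is an illustrative sketch of the idea, not real firmware.

```python
class Controller:
    """Toy SSD controller that buffers LSB data itself (the proposed mitigation)."""
    def __init__(self):
        self.lsb_buffer = {}  # controller-side, ECC-protected copies keyed by address

    def program_lsb(self, cell: dict, addr: int, lsb: int) -> None:
        self.lsb_buffer[addr] = lsb   # retain a trusted copy in the controller
        cell["lsb"] = lsb             # step 1 proceeds as before

    def program_msb(self, cell: dict, addr: int, msb: int) -> None:
        # Use the controller's copy, not the (possibly disturbed) on-chip read-back.
        cell["state"] = (msb, self.lsb_buffer.pop(addr))

ctrl, cell = Controller(), {}
ctrl.program_lsb(cell, addr=0, lsb=1)
ctrl.program_msb(cell, addr=0, msb=0)
print(cell["state"])  # (0, 1): LSB survives even if the cell itself was disturbed
```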

Figure 10

This would, however, cause a small increase in latency (the team estimates it at ~5 percent). That seems a modest price to pay in exchange for securing systems against this kind of attack. But it’s not clear if any NAND manufacturers would adopt these policies if they thought it might harm their competitive standing against the rest of the industry. Consumer and enterprise SSDs live and die on their ability to show best-in-class performance.

Then again, this kind of feature could find a home in enterprises, where companies are willing to pay for data security rather than just speed. The recent spate of ransomware attacks and the leak of the NSA’s various hacking tools could put new emphasis on ensuring systems remain secured against a wide variety of attack vectors, including low-level attacks like this one.


Bing launches bots for local businesses

Bots are coming to Bing in a big way. Through its Bot Framework, Microsoft is starting to integrate chatbots into search results — to make search more interactive and transactional.

In April, Matt McGee spotted the appearance of chat functionality for selected Seattle-area restaurants. That is now rolling out officially (still only to restaurants) through Bing Places and the newly launched Business Bot program. Microsoft will automatically create a bot from the data in Bing Places.

The business doesn’t need to do anything technical. It just answers a few structured questions and accepts the bot agreement terms. Thereafter, when users search for the business, a screen like the following will appear:

Users can then get basic questions about the business answered through the bot (e.g., “do you have outdoor seating?”). If there’s a question it can’t answer, the bot will refer the user to a phone number.

The bot can also ask business owners additional questions, depending on what information users are seeking. The new information will then be incorporated into the data set the bot uses to respond.
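For readers who think in code, here is a rough sketch of the behavior described above. It is not Microsoft’s Bot Framework API, just a plain-Python illustration with an invented listing (the business name, phone number and hours are all made up): answer from the structured listing data, fall back to the phone number, and queue unanswered questions for the business owner.

```python
# Illustrative only: a toy Q&A bot built from structured business-listing data.
listing = {"name": "Contoso Grill", "phone": "(555) 010-0199",
           "outdoor seating": "Yes", "hours": "11am-10pm daily"}
owner_questions = []  # questions the bot will relay to the business owner

def answer(question: str) -> str:
    for field, value in listing.items():
        if field in question.lower():
            return f"{field.capitalize()}: {value}"
    owner_questions.append(question)  # the bot asks the owner, then learns the answer
    return f"I'm not sure -- please call us at {listing['phone']}."

print(answer("Do you have outdoor seating?"))  # Outdoor seating: Yes
print(answer("Is the patio dog friendly?"))    # falls back to the phone number
print(owner_questions)                         # ['Is the patio dog friendly?']
```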

There are three noteworthy aspects of this development:

  • The consumer experience: ability to get deeper questions answered and possibly conduct transactions within the SERP
  • The automated creation of the bot using data provided to Bing Places
  • The fact that the bot will be available across channels and platforms (create once, publish across sites)

Local businesses will be able to add “channels” with the click of a button, such as Facebook Messenger and Cortana. Currently, Business Bots is available for Bing, Skype and SMS. Facebook Messenger and Cortana are coming soon.

Microsoft envisions rolling bots out broadly to local businesses. I assume that questions and functionality will need to be tuned by vertical as they go. The company also sees large enterprises deploying bots and chat in search results. Transactional capabilities will be added over time.

Google offers Message extensions for AdWords but doesn’t have comparable chat functionality in organic results — although that will likely come. But with Business Bots, Microsoft has made chatbots (and AI) immediately accessible to SMBs.

