Omniscient search: not quite the panacea it is held out to be

Britain's Guardian newspaper has a piece titled "How bots are taking over the world". The authors note that a shocking 70% of all trades on Wall Street are "automated", with complex algorithms driving them. They quote technologist Kevin Slavin as saying we live "in an algo-world" and novelist Daniel Suarez, who describes life as a "bot-mediated reality", as evidenced by the fact that "automated software performs the analysis of medical x-rays to find abnormalities, while risk-assessment algorithms decide a person's suitability for a credit card based on their financial history." Our lives, as they observe, "are in their hands, if indeed they have anything resembling hands."

Almost as automated is how we plumb for answers to queries that concern any and all aspects of our lives, from hotel stays to health issues. The reflex is to "google" it, with the search giant the overwhelming favorite, having stared down a legion of search interlopers from Ask Jeeves to Bing. Enter another giant: Facebook, which, according to a Bloomberg Businessweek article, is quietly delving deeper into the $15bn search market. Facebook's approach is slated to be quite different from Google's, whose secret sauce of algorithms scours the web for "relevant" content; instead, it will try to capitalize on its primacy in the social networking sphere. For instance, "Facebook’s wine-loving users might be able to query the closest wineries that have been liked most often." The allure of keeping users within its own site, rather than sending them off to Google or Microsoft's search partner, Bing, will likely be too great.
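
To make the contrast with keyword-driven search concrete, here is a minimal, hypothetical sketch of the kind of social-signal query the Businessweek example describes: filter places by proximity to the user and rank them by how often the user's friends have "liked" them. The data structures, coordinates and function names are illustrative assumptions, not Facebook's actual API or ranking logic.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt


@dataclass
class Place:
    name: str
    lat: float
    lon: float
    likes_by_friends: int  # how many of the querying user's friends liked this place


def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))


def rank_wineries(wineries, user_lat, user_lon, max_km=50):
    """Keep wineries within max_km of the user, ordered by friends' likes."""
    nearby = [w for w in wineries
              if distance_km(user_lat, user_lon, w.lat, w.lon) <= max_km]
    return sorted(nearby, key=lambda w: w.likes_by_friends, reverse=True)


if __name__ == "__main__":
    # Sample data is invented for illustration only.
    sample = [
        Place("Brooklyn Winery", 40.714, -73.957, likes_by_friends=12),
        Place("City Winery", 40.726, -74.009, likes_by_friends=30),
        Place("Red Hook Winery", 40.675, -74.015, likes_by_friends=7),
    ]
    for w in rank_wineries(sample, user_lat=40.75, user_lon=-73.99):
        print(w.name, w.likes_by_friends)
```

The point of the sketch is simply that the ranking signal comes from the social graph (who liked what) rather than from crawling and scoring the open web.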

That neither Google nor Facebook, in its soon-to-be search-equipped avatar, offers a fully human-to-human interchange is brought up by the authors of the Guardian article, who note that over half of those "clicking through our websites and profiles are not human". They point out that "Bots (internet robots), like any other scientific innovation, can be used for benign or malign purposes." That aspect is made acutely clear to businesses and individuals who are continually subject to cybercrime attacks, not to mention false data that skews business decisions at all levels.

A recent study by Incapsula, a cloud-based "Web Application Firewall" provider, found that over 51% of web traffic to a site comes from non-humans, including hackers, spammers and other malevolents; 5% of that traffic comes from "scrapers", who indulge in things like reverse-engineering pricing and business models and posting the content of the websites they target on other sites. Among the most commonly targeted victims of scrapers: the travel industry. The consequences for revenue and CRM seem quite self-evident.
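
As a rough illustration of how a site operator might take a first cut at estimating the non-human share of its own traffic, the sketch below tallies requests in a standard combined-format access log by user-agent string. The file path, marker keywords and log format are assumptions for illustration; Incapsula's actual classification is far more sophisticated, and many bots disguise their user-agents, so a heuristic like this undercounts.

```python
import re
from collections import Counter

# Crude heuristic: user-agent substrings that commonly indicate automated clients.
BOT_MARKERS = ("bot", "spider", "crawl", "scrape", "curl", "wget", "python-requests")

# Matches the quoted user-agent field at the end of a combined-format access log line.
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')


def classify_log(path):
    """Count requests that look automated vs. human in an access log."""
    counts = Counter()
    with open(path) as log:
        for line in log:
            match = UA_PATTERN.search(line)
            ua = match.group(1).lower() if match else ""
            kind = "non-human" if any(m in ua for m in BOT_MARKERS) else "human"
            counts[kind] += 1
    return counts


if __name__ == "__main__":
    totals = classify_log("access.log")  # hypothetical log file path
    total = sum(totals.values()) or 1
    for kind, count in totals.items():
        print(f"{kind}: {count} ({100 * count / total:.1f}%)")
```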

Published by

Vijay Dandapani

Co-founder and president of a New York-based hotel company for 24 years. Grew the firm to five hotels in Manhattan and also developed a greenfield project at MacArthur Airport, New York. Speaker at numerous prestigious forums including Economy Hotels World Asia, Lodging Conference, NYU, Columbia University Real Estate Roundtable, Baruch College's Zicklin School and ALIS. President and CEO of the New York City Hotel Association since January 2017.