There were more good items in the Best Biz Web Newsletter this month. The newsletter is free, but you must subscribe to receive it. If you have any interest in business resources, sign up now at Best of the Business Web. While you're there, check the blog – Thinking Out Loud – for Robert Berkman's thoughtful posts on the research process.
Of interest to me in the June newsletter were:
CORE – Connecting Repositories — aggregates open access research outputs from repositories and journals worldwide. CORE provides “services for different stakeholders including academics and researchers, repository managers, funders and developers”.
Lies, Damn Lies and Viral Content at the Tow Center for Digital Journalism, which describes and links to a report by Craig Silverman on “How News Websites Spread (and Debunk) Online Rumors, Unverified Claims and Misinformation.” Beware the viral story.
Journalists today have an imperative—and an opportunity—to sift through the mass of content being created and shared in order to separate true from false, and to help the truth to spread. This report includes a set of specific and, where possible, data driven recommendations for how this anti-viral viral strategy can be executed.
Dan Russell provides strategies and tactics for when “you need to learn about a topic area very quickly”. Start by thinking about the “domain” of interest – the subject area or topic – and “frame” your question; you don’t want to know everything about the topic. He reminds readers that a librarian can help in shaping the question and in selecting resources – to which I add, you may be able to get that advice through online chat with your public library. But you can also interview yourself – and his worked examples show the type of scans you can do of online resources.
To these I would add tools that can help clarify (disambiguate) the question. Today, DuckDuckGo is the main search engine that helps point out different aspects of a topic.
Subject directories used to be excellent tools, but today they are either closed or poorly maintained. It can still be beneficial to browse a subject tree – such as at the Toronto Virtual Library, or (dare I say) the Open Directory Project, for its categories.
Research by BlueNile into search practices shows an even split in the nature of queries: half are fragments (or phrases – just two or three words) and the other half are fully expressed queries. Another cut divided queries phrased as statements from those posed as questions. The study claimed to find that searchers have distinct approaches, but I think it more likely that searchers vary their approach depending on their knowledge and need.
Psychology of the Searcher, Nathan Safran, BlueNile Research (Apr 28)
Jeremy Gottlieb gives some pointers on using search operators at Google for researching competitors in the SEO industry.
Competitor Research Using Search Operators | A Launch Point For Beginners, distilled (May 21)
- searching for keywords in the title – using intitle:
- limiting the search to a particular site – using site:
- limiting the search to part of the url – using inurl:
All are good, but you may run into problems with inurl:. Repeated use of inurl: can trigger Google’s anti-abuse system, which will drive you crazy with CAPTCHAs to prove you are human.
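These operators can be combined in a single query. As a sketch (the helper function and the example competitor domain below are hypothetical, for illustration only, not part of any search API):

```python
def build_query(terms, intitle=None, site=None, inurl=None):
    """Assemble a Google-style query string from plain search terms
    and optional intitle:/site:/inurl: operator values (hypothetical helper)."""
    parts = list(terms)
    if intitle:
        parts.append(f"intitle:{intitle}")  # require the word in the page title
    if site:
        parts.append(f"site:{site}")        # restrict results to one domain
    if inurl:
        parts.append(f"inurl:{inurl}")      # require the word in the URL
    return " ".join(parts)

# e.g. scanning a competitor's blog section for guides on a keyword:
print(build_query(["link", "building"],
                  intitle="guide", site="example.com", inurl="blog"))
# → link building intitle:guide site:example.com inurl:blog
```

The resulting string can be pasted straight into the search box; Google applies the operators left to right with an implicit AND between terms.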
Searchers outside the United States sometimes want google.com rather than their country-specific version. Here is a workaround in Chrome that provides the option to search google.com; a similar approach could be used in Firefox.
How to Force Google Chrome to Use Google.com Instead of Country Specific Version, Jennifer Slegg, TheSEMPost (May 19)
One by one, Google strips away the features that made its search engine excellent for web search. This time it’s the reading-level search filter. Presumably it wasn’t used much, but that is hardly a good reason – not when it was a feature valued by a key segment of the user population.
Google Drops Another Search Filter: Reading Level, Barry Schwartz, Search Engine Land (May 8)
Karen Blakeman, a professional searcher, is one who feels the loss. In Google dumps Reading Level search filter, she comments that the feature helped to separate technical, serious articles from “consumer or retail focused pages” – which I think we could call the trivial. She wonders, as do I, which of the few remaining advanced search features Google will drop next. Pray that it won’t be the number range operator (e.g. camera $200..$500).
These reflections on searching by Dan Russell are on the mark for working with words. See Searching within a document, human memory, and other things you’d like Google search to do (May 7)
Site: is one of the most useful search operators at Google, Bing, Yahoo, and DuckDuckGo. It limits the search to the pages the search engine has indexed at a given web site. The form is site: followed by the domain name – e.g. site:utoronto.ca for everything in the University of Toronto’s domain. There are refinements – you may search a subdomain, e.g. site:ischool.utoronto.ca, or a subdirectory, e.g. site:utoronto.ca/research.
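A site:-restricted query can also be built into a search URL programmatically. A minimal sketch (the helper function is my own; only the standard `q` query parameter of Google's search URL is assumed):

```python
from urllib.parse import urlencode

def search_url(query, site=None):
    """Build a Google search URL, optionally restricted to one
    domain or subdomain with the site: operator (sketch)."""
    if site:
        query = f"{query} site:{site}"
    # urlencode percent-escapes the query so it is safe in a URL
    return "https://www.google.com/search?" + urlencode({"q": query})

print(search_url("open access", site="ischool.utoronto.ca"))
# → https://www.google.com/search?q=open+access+site%3Aischool.utoronto.ca
```

The same pattern works for Bing and DuckDuckGo, which also accept the query in a `q` parameter.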
This video, prepared by the Google Media team, is very good – it clearly describes how and when to use site:, with good examples. Find the video and Dan Russell’s introduction at Search Research — A new video on SITE: from Google’s Media team (May 4)
Of interest – Google Media Tools has many more helpful tools and tips.
The thrill that people in the search business used to have in showing others the power of Google has gone. Instead, there is alarm or dismay over Google’s dominance. This shows in Karen Blakeman’s slideshow presentation – New Google, New Challenges (April 2015) – which opens with a review of Google’s clashes over its search algorithm with Spanish newspapers, the EU, and France. There is a growing view that Google oversteps and oversells. Its current direction may be artificial intelligence, which may improve search and will definitely unnerve us more. At slide 44, Blakeman recaps useful syntax at Google. There aren’t many alternatives for broad web search – pretty well reduced to Bing and DuckDuckGo. Millionshort.com is on Blakeman’s list, but in my experience that search engine is often overloaded. The presentation closes with some tools and techniques for searching social media.
Matthew Gilley, at Press Gazette, has extracted Ten ways for investigative journalists to check their facts when using search engines and social media (Apr 16) from a new book – The Verification Handbook for Investigative Reporting.
The first two tips are about using search engines, about which we could say much more. Basically: learn the syntax and have several search engines at hand. I recommend using DuckDuckGo from time to time, and being diligent in searching out specialty sites.
The next eight tips concern analysis of what you find – check and check again. There’s considerable mention of user-generated content – or UGC – which will be helpful to journalists.
“User-generated content (UGC), like Youtube videos, can be useful to investigative journalists, but should also be treated with caution. If a video claims to show a certain place, a journalist can find satellite imagery of that location. Comparing landmarks (or road layouts) in the video and the satellite image can establish whether the video was indeed filmed where it is supposed to have been, and even the exact position of the camera.”
This is the second of two books with contributions from journalists, editors, and researchers on verification techniques and tools. Both are available online for viewing, downloading, or purchase. Must reads!
- The Verification Handbook: A definitive guide to verifying digital content for emergency coverage with ten chapters of tools and techniques.
- Verification Handbook for Investigative Reporting: A guide to online search and research techniques for using UGC and open source information in investigation with ten chapters and three case studies.