{"id":1058,"date":"2007-09-14T02:03:09","date_gmt":"2007-09-14T02:03:09","guid":{"rendered":"\/ifbookblog\/?p=1058"},"modified":"2007-09-14T02:03:09","modified_gmt":"2007-09-14T02:03:09","slug":"visual_search","status":"publish","type":"post","link":"https:\/\/futureofthebook.org\/blog\/2007\/09\/14\/visual_search\/","title":{"rendered":"visual search"},"content":{"rendered":"<p>I just came across <a href=\"http:\/\/oskope.com\/\">oSkope<\/a>, a snazzy new &#8220;visual search assistant&#8221; built by a Zurich\/Berlin outfit that allows you to graphically browse items on Amazon, ebay, Flickr or YouTube. More than a demo or prototype, it&#8217;s a fully functioning front end to the search engines of the afore-mentioned sites. I played around a bit in Amazon mode&#8230; below are some screenshots of a search for &#8220;Kafka&#8221; in Amazon&#8217;s book category. Each search cluster can be displayed in five different configurations (grid, stack, pile, list and graph), re-scaled with a slide bar, or rearranged manually by dragging items around. Click any cover and a small info window pops up with a link to the Amazon page. You can also drag items down into a folder for future reference. Very smooth, very tactile.<br \/>\nGrid:<br \/>\n<img loading=\"lazy\" decoding=\"async\" alt=\"oskopegrid.jpg\" src=\"\/blog\/archives\/oskopegrid.jpg\" width=\"500\" height=\"316\" \/><br \/>\nStack:<br \/>\n<img loading=\"lazy\" decoding=\"async\" alt=\"oskopestack.jpg\" src=\"\/blog\/archives\/oskopestack.jpg\" width=\"500\" height=\"316\" \/><br \/>\nPile:<br \/>\n<img loading=\"lazy\" decoding=\"async\" alt=\"oskopepile.jpg\" src=\"\/blog\/archives\/oskopepile.jpg\" width=\"500\" height=\"316\" \/><br \/>\nList:<br \/>\n<img loading=\"lazy\" decoding=\"async\" alt=\"oskopelist.jpg\" src=\"\/blog\/archives\/oskopelist.jpg\" width=\"500\" height=\"316\" \/><br \/>\nGraph (arranges items along axes of price and sales rank):<br \/>\n<img loading=\"lazy\" decoding=\"async\" alt=\"oskopegraph.jpg\" src=\"\/blog\/archives\/oskopegraph.jpg\" width=\"500\" height=\"316\" \/><br \/>\nA few months back I linked to <a href=\"\/blog\/archives\/2007\/06\/visual_amazon_browser.html\">another visual Amazon browser<\/a> from TouchGraph that arranges book clusters according to customer purchase patterns (the &#8220;people who purchased this also bought&#8230;&#8221;). I&#8217;m still waiting for someone to visualize the connections in the citation indexes: create a cross-referential map that shows the ligatures between texts (as pondered <a href=\"\/blog\/archives\/2007\/03\/amazon_starts_to_close_the_loo.html\">here<\/a>). Each of these ideas is of course just an incremental step toward more advanced methods of getting the &#8220;big picture&#8221; view of digital collections.<br \/>\noSkope, though it could still use some work (Flickr searching was unpredictable and didn&#8217;t seem to turn up nearly as much as what I&#8217;m sure is in their system, Ebay wasn&#8217;t working at all), is a relatively straightforward and useful contribution &#8211; ?\u009dmore than just eye candy. It even helped me stumble upon something wonderful: a recently published <a href=\"http:\/\/www.amazon.com\/Kafka-Robert-Crumb\/dp\/1560978066\/ref=si3_rdr_bb_product\/103-7259321-9883052\">study<\/a>  (appropriately, visual) of Kafka, a collab between comic artist R. 
oSkope, though it could still use some work (Flickr searching was unpredictable and didn't seem to turn up nearly as much as I'm sure is in the system, and eBay wasn't working at all), is a relatively straightforward and useful contribution – more than just eye candy. It even helped me stumble upon something wonderful: a recently published [study](http://www.amazon.com/Kafka-Robert-Crumb/dp/1560978066/ref=si3_rdr_bb_product/103-7259321-9883052) of Kafka (appropriately, a visual one), a collaboration between comic artist R. Crumb and Kafka scholar David Mairowitz.

Browsing graphically is often more engaging than scanning a long list of results, and a crop of new tools – [LibraryThing](http://www.librarything.com/), [Shelfari](http://www.shelfari.com/), [Delicious Library](http://www.delicious-monster.com/), and now [Google Books](http://booksearch.blogspot.com/2007/09/my-own-library-on-book-search.html) – has recently emerged to address this, all riffing in similar, somewhat nostalgic ways on the experience of shelves (Peter Brantley just blogged [another idea in this vein](http://radar.oreilly.com/archives/2007/09/shelf_view.html)). iTunes has gotten in on this too, its album cover flipper becoming a popular way to sift through one's music collection.

Perhaps it's telling, though, that these visual, shelf-inspired browsing tools focus on old media: books, albums... all bounded objects. You couldn't simply graft this approach onto web search and get the same effect (although page previews, of the sort that [Snap](http://www.snap.com/) provides, are becoming increasingly popular). Vast, shifting collections of unbounded, evolving, recombining, and in many cases ephemeral media will most likely need different visualization tools. What might those be?

(oSkope link via [Information Aesthetics](http://infosthetics.com/archives/2007/09/oskope_visual_search_shopping.html))