Subject: ME768 : Project Report | Year/Sem: 1999-2000, IInd Semester | Instructor: Amitabha Mukherjee | Author: Apurva Sharma


CONTENTS:
  • Motivation
  • Relation To Past Work
  • Methodology
  • Results
  • Future Work
  • Bibliography/Webinfo

    MOTIVATION

    Computer technology has dramatically enhanced our ability to generate, deliver and store information. Unfortunately, our tools for locating, filtering, and analyzing information have not kept pace. A popular solution is intelligent agents.

    An agent, here, means someone who acts on your behalf. Information agents are loosely analogous to travel agents, insurance agents, etc.

    Software agent technology has seen a lot of development recently. As a result we see search agents, auction agents, bidding agents, chatting agents, etc. To quote from http://www.siliconalleyreporter.com (23 December 1998):

    Today the e-commerce companies are paying portals a heavy ransom for traffic. In the future, they will pay intelligent agent services (most of which will be owned by the portals) for traffic. The successful portals of the future will combine a trusted brand, strong editorial, and data leveraging services like comparison shopping.

    From my review of a paper on ShopBot, I got the idea of a comparison shopping agent. The importance of Internet agents is stressed by the following quote:

    From the consumer's point of view, it is virtually impossible to find the small set of pages that list a specific product for sale. From the vendor's point of view, it is extremely difficult to attract qualified buyers to their site.

    The job of the proposed "BestPrix" agent would be to find out the prices of a particular commodity from a few online vendors and return a comparison list to the user, thereby assisting them in their purchase.

    Back to contents

    RELATION TO PAST WORK

    My work draws much inspiration from ShopBot. ShopBot was developed at the University of Washington to demonstrate a domain-independent comparison-shopping agent. Its commercial successor is http://jango.excite.com.

    The Bargain Finder agent helps with comparison shopping for music CDs.

    Reel.com suggests movies "similar" to the one specified, from a specified period.

    Airfare.com finds the lowest fares in a market before you book.

    Price-Search is similar to Jango. Given a product, it searches its resources for the best price on offer.

    Besides these, there are many other software agents for the Internet (called BOTs).

    Back to contents

    METHODOLOGY

    The "BestPrix" agent has the following two problems to solve: Back to contents
     

    RESULTS

    I have been able to implement the basic architecture of the BestPrix agent. As of now I have added support only for the music CDs domain. Two vendors, http://www.cdnow.com and http://www.cduniverse.com, are included.
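
    To make the implemented flow concrete, here is a minimal sketch of the query-and-extract loop. The search endpoints, query parameter names, and price pattern below are hypothetical placeholders and not the actual parameters of the vendor sites; the real result pages are handled through the heuristics mentioned below.

      # Minimal sketch of a BestPrix-style query-and-extract loop.
      # Endpoints, query parameters, and the price pattern are assumed,
      # not taken from the actual vendor sites.
      import re
      import urllib.parse
      import urllib.request

      VENDORS = {
          "cdnow.com": ("http://www.cdnow.com/search", "title"),            # assumed
          "cduniverse.com": ("http://www.cduniverse.com/search", "title"),  # assumed
      }

      PRICE = re.compile(r"\$(\d+\.\d{2})")  # assumed price format on result pages

      def compare_prices(album):
          """Query each vendor for `album` and return the lowest price found per vendor."""
          best = {}
          for vendor, (endpoint, param) in VENDORS.items():
              url = endpoint + "?" + urllib.parse.urlencode({param: album})
              try:
                  with urllib.request.urlopen(url, timeout=10) as reply:
                      page = reply.read().decode("latin-1", errors="replace")
              except OSError:
                  continue  # vendor unreachable: leave it out of the comparison
              prices = [float(p) for p in PRICE.findall(page)]
              if prices:
                  best[vendor] = min(prices)
          return best

      if __name__ == "__main__":
          for vendor, price in sorted(compare_prices("Abbey Road").items(), key=lambda v: v[1]):
              print("%-20s $%.2f" % (vendor, price))

    The point of the sketch is only the overall shape: one query per vendor, prices extracted from each result page, and a single comparison list returned to the user.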

    There are things still remaining to be looked into. These shortcomings are mainly due to the heuristics used to reduce the search space and model the domain. To give an example of a possible scenario not catered for by the present agent implementation:

    Consider the music CD domain itself. If the first search field (key one) is taken as Record Label, the results present an altogether different picture: an extra level of indirection is added. The results give a list of the albums offered, and one has to follow these links to reach the actual pages from which the prices and other details can be obtained. The current agent cannot do so, simply because its model does not include this case, as the sketch below illustrates.
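
    As an illustration of the missing step only, the sketch below shows the extra hop such a search would need: collect the album links from the Record Label result page, fetch each linked page, and only then read the price. The link and price patterns and the helper names are invented for the illustration and are not part of the current agent.

      # Hypothetical sketch of the extra level of indirection added by a
      # Record Label search: the price sits one link below the result page.
      # The link and price patterns here are invented for the illustration.
      import re
      import urllib.parse
      import urllib.request

      ALBUM_LINK = re.compile(r'href="(/album/[^"]+)"')  # assumed album-link format
      PRICE = re.compile(r"\$(\d+\.\d{2})")              # assumed price format

      def fetch(url):
          with urllib.request.urlopen(url, timeout=10) as reply:
              return reply.read().decode("latin-1", errors="replace")

      def prices_via_label_search(base_url, label_results_url):
          """Follow each album link on a Record Label result page to reach its price."""
          listing = fetch(label_results_url)
          prices = []
          for path in ALBUM_LINK.findall(listing):
              album_page = fetch(urllib.parse.urljoin(base_url, path))
              found = PRICE.search(album_page)
              if found:
                  prices.append(float(found.group(1)))
          return prices
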
     Back to contents
     

    FUTURE WORK

    Future work in this area can take two directions. One possibility is to strengthen the model to include more complex scenarios. But I believe this would complicate the design of the agent and adversely affect its efficiency and simplicity of use; that was the reason I rejected suggestions like using a dictionary, word stemming, a human-administered learner, etc.

    The other direction is related to the current progress in XML. As the trend shows, future websites are going to be based on XML, which makes it easy for automated agents to parse and understand their structure. So if somebody designs a DTD and an associated automated agent capable of understanding it, and if several vendors use that DTD to publish an image of their website compliant with the agent, web shoppers could use this agent to automatically obtain the best prices for their chosen products, and perhaps actually make purchases through the agent.
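
    A toy illustration of that idea, assuming an invented DTD in which a vendor's catalog is a list of items carrying a title and a price: once every vendor publishes such a feed, the agent can read prices with a generic XML parser and needs no per-vendor heuristics at all. The element names and the sample feed below are made up for the sketch.

      # Toy illustration of the XML direction: vendors publish catalogs
      # conforming to a shared DTD (element names below are invented), and
      # the agent extracts prices with a generic parser instead of
      # per-vendor heuristics.
      import xml.etree.ElementTree as ET

      SAMPLE_FEED = """
      <!DOCTYPE catalog SYSTEM "bestprix.dtd">
      <catalog vendor="example-store.com">
        <item>
          <title>Abbey Road</title>
          <artist>The Beatles</artist>
          <price currency="USD">11.99</price>
        </item>
      </catalog>
      """

      def extract_offers(feed_xml):
          """Yield (vendor, title, price) for every item in a DTD-compliant catalog feed."""
          catalog = ET.fromstring(feed_xml)
          vendor = catalog.get("vendor")
          for item in catalog.findall("item"):
              yield vendor, item.findtext("title"), float(item.findtext("price"))

      if __name__ == "__main__":
          for vendor, title, price in extract_offers(SAMPLE_FEED):
              print("%s offers %s at $%.2f" % (vendor, title, price))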

    Back to contents

    BIBLIOGRAPHY/WEBINFO

    Bibliography

    @Article{Doorenbos/Etzioni/Weld:1997,
      author=      { Doorenbos, Robert B. and Etzioni, Oren and Weld, Daniel S. },
      keywords=    { AGENTS WWW },
      institution= { UWASH-CSE },
      title=       { A Scalable Comparison-Shopping Agent for the WWW },
      journal=     { Autonomous Agents },
      year=        { 1997 },
      e-mail=      { bobd@cs.washington.edu, etzioni@cs.washington.edu, weld@cs.washington.edu },
      url=         { ftp://ftp.cs.washington.edu:21/pub/etzioni/softbots/agents97.ps },
      annote=      {
                     This paper describes a domain-independent comparison-shopping
                     agent named ShopBot. Given the home pages of several online
                     stores, ShopBot autonomously learns how to shop at those
                     vendors. After learning, it is able to speedily visit a dozen
                     software and CD vendors, extract product information, and
                     summarize the results for the user. ShopBot achieves this
                     without sophisticated natural language processing, and requires
                     only minimal knowledge about different product domains. Instead
                     ShopBot relies on a combination of heuristic search, pattern
                     matching, and inductive learning techniques. ShopBot is unique
                     in its ability to learn to extract information from the
                     semi-structured text published by Web vendors.
                     The most important regularity exploited is that vendors
                     structure their store fronts for easy navigation and use a
                     uniform format for product descriptions.
                     Major limitations are that it is limited to stores that
                     provide a search-able index, and, relies heavily on HTML.
    
                     The paper clearly describes the heuristics used at different
                     stages with justification. Results of trials of the ShopBot
                     are also provided which support the authors' claims about the
                     utility of the software. }
    }

    WebInfo

    Back to contents