NSF Scholar Awards are available to support conference attendance.

The Advanced Program is available online.

For hotel reservations, please call 1-888-627-8561.

Please note that there are *two* Westin hotels in San Francisco. Please make sure that you make your reservation at the correct hotel.

Call for Challenges

The ISBI 2013 Conference is soliciting proposals for scientific challenges. The aim of the challenges is to accelerate the pace of research by hosting quantitative comparisons of competing approaches on common datasets.

Each challenge needs one or more organizers, who will be responsible for providing training and testing data, defining the tasks, specifying the metrics, managing entries, and conducting the studies and on-site presentations. The format of each challenge is up to the organizers and should be detailed in the proposal. Live, on-site evaluations are encouraged, and the top-scoring teams should give presentations. ISBI will assist in advertising challenges, but the organizers are ultimately responsible for challenge participation and success.

Challenges will be held on the last day of the ISBI conference, Thursday, April 11th. Each challenge will be allocated a half-day session.

Important Challenge Dates

Challenge Proposal Deadline: November 1, 2012
Notification of Acceptance: November 5, 2012
Challenge Announcements (Tasks, Data, and Metrics): November 25, 2012
Summary of Challenge Participants: February 1, 2013

How to Submit a Challenge Proposal

To propose a challenge, send a PDF file containing the following information to the ISBI 2013 Contest Co-Chairs:

  • Stephen Aylward (Stephen.aylward(at)Kitware.com)
  • Bram van Ginneken (b.vanginneken(at)rad.umcn.nl)

All submissions will be acknowledged by email. Please include the following in the PDF:

  1. Contest title and abstract
  2. Name and contact information of the main organizer and at least 2 other expert committee members
  3. General description of the problem
  4. Description of the dataset to be used and allocation to training and testing
  5. Description of the actual competition tasks
  6. Evaluation metrics
  7. Plan of how to organize the contest
  8. Estimated number of participants
  9. Other points to take into account:
    • The dataset should be interesting, available, and sufficiently large.
    • The tasks should be interesting and novel, but also accessible without too much domain-specific knowledge.
    • The evaluation metrics should be clear and easy to apply. Supplying implementations in C++ or Python is ideal.
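As an illustration of the kind of easy-to-apply metric implementation mentioned above, here is a minimal Python sketch of the Dice similarity coefficient, a common overlap measure for binary segmentation masks. The function name and the flat-sequence mask representation are assumptions for this example, not part of any specific ISBI challenge:

```python
def dice_coefficient(pred, truth):
    """Dice overlap between two binary masks, each given as an
    equal-length sequence of 0/1 values: 2*|A∩B| / (|A| + |B|)."""
    if len(pred) != len(truth):
        raise ValueError("masks must have the same number of voxels")
    # Count voxels labeled 1 in both masks.
    intersection = sum(p and t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    if total == 0:
        # Both masks empty: treat as perfect agreement by convention.
        return 1.0
    return 2.0 * intersection / total
```

A metric distributed in this self-contained form lets every participating team score their own results identically before submission.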

Reference Material

The following links are provided for convenience. The services provided are not specifically endorsed by ISBI.