Pithy Advice for Lab Members

From Healthcare Robotics Wiki
  1. Check in your code. Try to check in your code on a daily basis, and write an informative log message for each check-in; "svn log" should then be useful.
  2. Don't be shy.
  3. Add to the lab wiki.
  4. Use open source, when possible.
  5. Be open, when possible.
  6. Look for publication and funding opportunities.
  7. Pay attention to aesthetics. In robotics, style is important. In engineering and science, beauty often has value.
  8. Close the loop! (many corollaries)
    1. Implement and implement often. -- Rod Brooks
    2. Avoid premature optimization.
    3. Focus on the system.
    4. Without implementing a sketch of the whole system, you are doomed to fall into a local minimum.
  9. How will you know you've succeeded? What are your hypotheses? What will be the measurable results?
  10. Who cares? What is your contribution? Who will cite your work?
  11. Beware of intra-lab competition. Winning will beget rewards for everyone. Research is not a zero-sum game.
  12. Prototype in Python. If necessary, profile and write targeted C++ code (a profiling sketch appears after this list).
  13. Visualize your results, whenever possible (see the plotting sketch after this list).
    1. It will help improve your intuition and understanding.
    2. It will help communicate your results.
    3. It will often reveal bugs in your thinking, hardware, and software.
  14. Video your progress! Robots are finicky. What works one day may not work the next, especially when a robot is under active development. At each point of progress you should video the robot working, so that you always have something to show. Video it now - it may be your last chance...
  15. If it doesn't work on a real robot, it doesn't work. We are a robotics lab that is focused on real robots operating in the real world. Simulation is a very useful tool, but success in simulation does not imply success on a real robot. Always seek ways to validate your work on a real robot operating under realistic conditions.
    1. Simulations are doomed to succeed. -- Rod Brooks
  16. Use off-the-shelf products and standard methods when possible.
    1. There is no credit for re-inventing the wheel, but there is a high cost.
    2. Using your own methods when suitable methods have already been published in the community can distract people from the point of your research, and make the communication of your results more difficult.
    3. When writing an academic paper, it is wonderful to be able to write something like, "We used method X as published by Y et al. in Z". This simplifies the writing, demonstrates your scholarship, and helps reinforce your membership in the research community. Researchers are also very susceptible to flattery. Citations will win you friends.
  17. Look before you leap. Before heavily investing in a direction of research, look for related work. Search engines and online publications make this easier than ever.
    1. If you look before you invest significant resources, finding related work is a joy. You can build on the work, and it's much easier to read about something than to develop, test, and write about it yourself.
    2. If you look after you invest significant resources, finding related work is terrifying. In this situation, you will not want to see the results of your investment diminish in value. You will be filled with dread as you try to motivate yourself to find related work and draw connections to it. It will be tempting to perform a poor search and state, "I didn't see anything related to my work!"
  18. Beware of tweaking. Parameter tweaking can become a bad addiction and can be a sign that there is a better way to do things.
    1. Collect a large set of data so that you can understand how a parameter change influences the statistics of the performance, rather than just a single example case.
    2. Look for ways to have the machine automatically search the parameter space for good settings (see the parameter-sweep sketch after this list). This will also help you define success objectively.
    3. Long test cycles (change parameters -> test system) quickly add up. Be very careful.
    4. Look for additional methods that will reduce the system's sensitivity to the parameters.
    5. Remember that in the general case, the size of the search space grows exponentially with the number of parameters.
    6. Try to reason about the parameters to justify your settings, rather than blindly tweaking. Make assumptions, do some math, etc.
    7. You should say "that's about as well as it's going to work" and move on, rather than searching for a local optimum. Research is not about finding local optima. That's what industry is for.
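
For item 12, a minimal sketch of profiling a Python prototype with the standard-library cProfile and pstats modules before deciding whether any targeted C++ is warranted. The run_prototype, nearest_neighbor, and distances functions are hypothetical stand-ins for whatever your prototype actually computes.

  # Sketch for item 12: profile the prototype before writing any C++.
  import cProfile
  import pstats
  import math
  import random

  def distances(points, query):
      # Naive O(n) distance computation -- a typical prototype hot spot.
      return [math.hypot(px - query[0], py - query[1]) for px, py in points]

  def nearest_neighbor(points, query):
      d = distances(points, query)
      return min(range(len(points)), key=lambda i: d[i])

  def run_prototype():
      random.seed(0)
      points = [(random.random(), random.random()) for _ in range(2000)]
      queries = [(random.random(), random.random()) for _ in range(500)]
      return [nearest_neighbor(points, q) for q in queries]

  if __name__ == "__main__":
      profiler = cProfile.Profile()
      profiler.enable()
      run_prototype()
      profiler.disable()
      # Print the ten most expensive calls by cumulative time; only the
      # functions that actually dominate are candidates for C++.
      pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)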
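
For item 13, a minimal plotting sketch using numpy and matplotlib (both assumed to be available). The per-trial "errors" array is fabricated stand-in data; the point is that plotting raw trials next to a histogram tends to expose outliers and bugs that a single summary number would hide.

  # Sketch for item 13: plot raw results, not just a summary statistic.
  import numpy as np
  import matplotlib.pyplot as plt

  rng = np.random.default_rng(0)
  errors = rng.normal(loc=2.0, scale=0.5, size=200)   # stand-in data (cm)
  errors[50] = 15.0                                    # a gross outlier a mean would hide

  fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

  ax1.plot(errors, ".")
  ax1.set_xlabel("trial")
  ax1.set_ylabel("position error (cm)")
  ax1.set_title("error per trial")

  ax2.hist(errors, bins=30)
  ax2.set_xlabel("position error (cm)")
  ax2.set_ylabel("count")
  ax2.set_title("error distribution")

  fig.tight_layout()
  fig.savefig("position_error.png")   # keep the figure alongside the data and code
  plt.show()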
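
For item 18, a minimal sketch of letting the machine search a small parameter grid instead of hand tweaking. The detector function, the "threshold" and "smoothing" parameters, and the dataset are hypothetical placeholders; the score function is what makes success objective, and the exhaustive loop makes the exponential growth in combinations (item 18.5) concrete.

  # Sketch for item 18: replace hand tweaking with an automatic parameter sweep
  # scored over a whole dataset.
  import itertools
  import random

  def detector(sample, threshold, smoothing):
      # Stand-in for the real system: predicts True if the smoothed value
      # exceeds the threshold.
      return (smoothing * sample["value"]) > threshold

  def score(params, dataset):
      # Objective definition of success: fraction of samples classified correctly.
      correct = sum(detector(s, **params) == s["label"] for s in dataset)
      return correct / len(dataset)

  # Stand-in dataset; in practice this would be logged robot data.
  random.seed(0)
  dataset = [{"value": random.uniform(0, 10), "label": random.uniform(0, 10) > 5}
             for _ in range(500)]

  grid = {
      "threshold": [2.0, 4.0, 6.0, 8.0],
      "smoothing": [0.5, 1.0, 1.5],
  }

  # Every combination is evaluated -- the count grows exponentially with the
  # number of parameters, which is why the grid must stay small.
  results = []
  for values in itertools.product(*grid.values()):
      params = dict(zip(grid.keys(), values))
      results.append((score(params, dataset), params))

  best_score, best_params = max(results, key=lambda r: r[0])
  print("evaluated %d settings" % len(results))
  print("best score %.3f with %s" % (best_score, best_params))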