
Star Wars Land Speeder Scan #1

by clothbot, published Apr 29, 2011


Thing Statistics: 6387 Views, 615 Downloads


My first attempt at taking a set of photos and running them through http://my3dscanner.com to generate a point cloud.

I removed the windshield for scanning.


  • The reflections off the semi-glossy black surface show up as "deep" points right under the land speeder.
    • Lesson: matte surfaces are better.
  • Shadows make for empty zones, particularly under the rear jets.
    • Lesson: lighting matters.
  • Close-up photos don't help.
    • Lesson: take more photos at a distance instead of up close.

...much of which is captured in the General Scanning Guide (http://www.my3dscanner.com/index.php?option=com_k2&view=item&id=5:general-scanning-guide&Itemid=59); it's just interesting to see how not following those tips affects the final result.


Step 1: Download the .ply dataset.

Step 2: Open it in Meshlab.

Step 3: Clean up the point cloud and mesh it.

Probably follow @tbuser's lead in the comments of http://www.thingiverse.com/thing:7950
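As a rough companion to Steps 2 and 3, here's a minimal pure-Python sketch of loading the point cloud and clipping out the spurious "deep" reflection points noted above. It assumes the .ply is ASCII-encoded with x, y, z as the first three vertex properties (my3dscanner's actual export may be binary, in which case stick with Meshlab); `clip_floor` and its `floor_z` threshold are hypothetical names you'd tune by eye.

```python
# Minimal ASCII .ply reader -- a sketch, not a full PLY implementation.

def read_ply_vertices(path):
    """Return a list of (x, y, z) tuples from an ASCII PLY file."""
    with open(path) as f:
        assert f.readline().strip() == "ply", "not a PLY file"
        n_vertices = 0
        for line in f:                      # scan the header
            parts = line.split()
            if not parts:
                continue
            if parts[:2] == ["element", "vertex"]:
                n_vertices = int(parts[2])
            elif parts[0] == "end_header":
                break
        vertices = []
        for _ in range(n_vertices):         # read vertex records
            x, y, z = map(float, f.readline().split()[:3])
            vertices.append((x, y, z))
    return vertices

def clip_floor(vertices, floor_z):
    """Drop the "deep" reflection points by clipping below a chosen floor height."""
    return [v for v in vertices if v[2] >= floor_z]
```

From there, the Poisson reconstruction discussed in the comments below turns the cleaned points into a mesh.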


Also, I think it helps to have a textured background instead of a smooth uniform color; at least I find it helpful when I mesh it. Meshlab's Poisson filter tries to turn the whole scene into one object, so if there are big open areas with no points, it kind of wraps the object around an imaginary sphere (which is sometimes inside out) and makes it harder to clean up.

Dear owner of clothbot,

Ur bot sez u need better scans!

Seriously... I used to take photographs for 3D point cloud reconstruction via Photosynth... your advice is GOOD advice; here's what I can add to it:

Take photos a minimum of every 30 degrees. 15 degrees is better if you have the time.
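That shooting plan is easy to sanity-check in a few lines of Python (a throwaway sketch, not part of any tool):

```python
# Camera positions for one full orbit of the object, one photo every
# `step_deg` degrees around the circle.
def shot_angles(step_deg):
    return list(range(0, 360, step_deg))

print(len(shot_angles(30)))  # 12 shots minimum
print(len(shot_angles(15)))  # 24 shots if you have the time
```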

As you mention, glossy is bad. It tries to treat the reflection as if it were a pigmentation on the surface, and fails pretty miserably. For something like this, cover the shiny bits with baby powder or something along those lines. It's OK if it's a little sparse or chunky; that'll actually give the algorithm extra bits (corners) to latch onto. In the algorithm, a corner is any set of a few pixels with a big gradient in them; it tries to transform these corners from one photo to the next to derive the point cloud.

Likewise, contrast is good. If you could repaint the area behind the seats white, for instance, they'd show up better.

This worked in photosynth, not sure how it'll work on my3dscanner... to get detail shots of something, zoom by no more than 30% per shot. In other words, more than half of the zoomed in shot should be visible in some other picture for it to align the camera state data properly.
In fact, that was the general rule for EVERY picture in photosynth... every shot should be more than 50% similar to at least one other shot...
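Those two rules of thumb imply a geometric zoom ladder. A quick back-of-the-envelope sketch, assuming "zoom by no more than 30% per shot" means each shot's magnification is at most 1.3x the previous one:

```python
import math

# Number of intermediate shots needed to reach `target_zoom` magnification
# when each shot zooms in by at most `step` (1.30 = 30%) over the last one.
def shots_to_reach(target_zoom, step=1.30):
    return math.ceil(math.log(target_zoom) / math.log(step))

print(shots_to_reach(2.0))  # 3 shots to work up to a 2x close-up
print(shots_to_reach(4.0))  # 6 shots to work up to 4x
```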

Shiny objects can also be dulled with Magnaflux developer (SKD-S2), used for checking for cracks in engine blocks. It's acetone-based (it evaporates really fast), and the powder is easily washed off with water, rain, sleet and sometimes, dark of night . . .

Thanks for the tips!

I left the dust on this model so it wouldn't be as bad as a cleaned surface, but it wasn't nearly as effective as I'd hoped. ;-)

I took 72 pictures for this dataset (shutter speed of 5 seconds each), of which only 40 were reported used. A lot of the "holes" do correspond to areas where I was zoomed in closer. I tried taking sequences of progressively zoomed-in pictures in case they helped increase effective resolution.

In hindsight, I didn't position the model high enough to get quality ground-skirt-level pictures.

Once I've honed my skill on the small to moderately sized stuff, I'll try taking some trips out to a couple of local museums with much more interesting targets. :-)


Great attempt!

May I make 3 statements for discussion purposes only?

1. Objects created by simple CAD modeling are probably not worth scanning. They can be much more easily and accurately replicated by simple measuring and reverse CAD modeling. Photo-based scanning is good for man-made organic forms which are difficult to replicate with basic CAD shapes.

2. CAD-designed and machine-made objects usually have very little texture (they are mostly made of plastic or metal and smoothly painted). They have few surface points, unlike man-made organic objects. http://my3dscanner.com cannot digest non-textured objects.

3. We have a 3D collection of 388 coffee mugs, 78 staplers, 98 keyboards, 46 space rockets, 19 humanoids and 29 flying plates at http://www.my3dscanner.com

Is Thingiverse community really interested in seeing more of those?

Is Thingiverse community really interested in seeing more of those?

Yes! I don't know if I need 388 different coffee mugs, but it sure would be cool if I could call up a 3D model of every toy ever made. :-D

Very valid points. I chose this particular object because it's a fun example and easy to measure against for model verification.

It's just complex enough with the inner and partially-hidden surfaces that it makes figuring out the workflow from photo to 3D print a more interesting and useful challenge (for me at least).

In this particular case it's debatable whether it ever had any CAD origins. It's a 30-year-old toy I bought new when I was a kid. :)