kinect point cloud


    just using the saturation value (close range), and manually aligning the texture to the depth, which isn't really right: it needs to be projected onto the depth map.

    note that this uses hector's default 'gamma' function, but what really needs to happen is taking the inverted reciprocal 1/(1-x), since i'm pretty sure the 11-bit image returned by the depth sensor is a disparity image. edit: i just posted a function for converting to meters here: github.com/OpenKinect/openkinect/wiki/Imaging-Information
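    to make that disparity-to-depth idea concrete, here's a minimal C sketch; the constants are the commonly cited openkinect linear approximation, not necessarily the exact function posted on the wiki:

        /* minimal sketch: convert an 11-bit kinect raw disparity value to meters.
         * the 1 / (a * raw + b) form reflects the "inverted reciprocal" idea above;
         * the constants are the commonly cited openkinect linear fit and may not
         * match the exact function on the wiki page. */
        float raw_depth_to_meters(int raw_disparity) {
            if (raw_disparity < 2047) { /* 2047 marks pixels with no reading */
                return 1.0f / (raw_disparity * -0.0030711016f + 3.3309495161f);
            }
            return 0.0f;
        }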

    if you want to see this in realtime, swap the glview.c in your copy of the github openkinect project for the glview.c posted on the OF board: www.openframeworks.cc/forum/viewtopic.php?f=14&t=4947...


    1. design io 42 months ago | reply

      that's insane you could pull that out of my screenshot :)

    2. Kyle McDonald 42 months ago | reply

      i grabbed the raw png, it's looking good :)

    3. design io 42 months ago | reply

      want some video?
      :)

    4. Kyle McDonald 42 months ago | reply

      yes please :) ideally with some big movements and maybe even moving the camera around :)

    5. danomatika 42 months ago | reply

      now that's what I'm talking about ...

    6. Jamie Dubs 42 months ago | reply

      Oh shit! House of Cards mode!

    7. chickenssaur 42 months ago | reply

      what are those 2 big parallel lines?

    8. Kyle McDonald 42 months ago | reply

      those are glitches i accidentally added due to manually cropping the depth image: a little white along the border shows up as a line.
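      as a rough sketch of one fix, you could zero out a small margin of the cropped depth image before drawing it, so those stray border pixels get skipped (names and sizes here are just illustrative):

        /* illustrative only: mask a few pixels around the border of the depth
         * image so white (max-value) pixels left over from cropping don't get
         * drawn as points along a line. */
        void mask_depth_border(unsigned short *depth, int width, int height, int margin) {
            for (int y = 0; y < height; y++) {
                for (int x = 0; x < width; x++) {
                    if (x < margin || x >= width - margin ||
                        y < margin || y >= height - margin) {
                        depth[y * width + x] = 0; /* treat as invalid and skip when drawing */
                    }
                }
            }
        }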

    9. mr.russ 42 months ago | reply

      Holy crap! Does this mean we will be able to create 3d models with it?

    10. Kyle McDonald 42 months ago | reply

      definitely, using a few scans and then some post processing in meshlab or blender you can align, mesh, and zipper them. not easy, but significantly easier than before.

    11. mr.russ 42 months ago | reply

      No doubt some clever cookie will cook up something. It will be a fine day when I can scan a component into solidworks or the like using a kinect. No idea how much commercial scanners cost though.

    12. Tiago Serra 42 months ago | reply

      Very impressive point cloud for close range.

      mr.russ: something like what Kyle was talking about: vimeo.com/7489094 ;)

    13. Kyle McDonald 42 months ago | reply

      i have one of those, in black, sitting behind me (donated by makerbot labs) :)

    14. mr.russ 42 months ago | reply

      I have seen that video before, Tiago, when I was looking at turning a photogrammetry-generated point cloud into a mesh. :)
      Ah, 3D printers & scanners, so much fun to be had. It really is an engineer's world ;P

    15. Tiago Serra 42 months ago | reply

      Hopefully it will not be an engineers' world when we all have 3D printers and scanners at home, but we're getting there ;)

    16. Tiago Serra 42 months ago | reply

      Kyle, just curious: the Kinect is awesome alright, I had to buy one :), but have you considered implementing other photogrammetry methods, like using different light sources for better textured face scanning, like this?
      vimeo.com/14028931

      Perfection isn't always the most creative thing to accomplish, though...

    17. Kyle McDonald 42 months ago | reply

      i have considered some other methods like this but they generally involve complex LED lighting rigs that are expensive or require lots of calibration. i'm more interested in the most DIY stuff that doesn't require much in the way of expertise or resources :)

    18. Tiago Serra 42 months ago | reply

      Agree, thanks :)
