The biggest issue right now turns out to be sensor and display technology. Knowing when to occlude something means knowing how far away it is, and how far away everything else is too, and that takes detailed depth sensing, something that's still quite difficult. Depth-sensing IR cameras are still fairly low-resolution, but they're getting better, and so is software that can use color information to infer depth. Put these together, and we may soon get somewhere believable, and maybe even useful.
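To make the occlusion idea concrete, here's a minimal sketch (my own illustration, not code from any particular AR toolkit) of the per-pixel test a renderer performs once it has a depth map: a virtual object's pixel is drawn only where it is closer to the camera than the real scene at that pixel. The function and array names are hypothetical.

```python
import numpy as np

def composite(scene_rgb, scene_depth, virtual_rgb, virtual_depth):
    """Overlay virtual pixels onto the camera image wherever they are nearer.

    scene_depth / virtual_depth: per-pixel distances from the camera, in meters.
    A large value in virtual_depth means "no virtual content at this pixel".
    """
    visible = virtual_depth < scene_depth   # the occlusion test, per pixel
    out = scene_rgb.copy()
    out[visible] = virtual_rgb[visible]     # virtual content wins where it's closer
    return out

# Toy 2x2 frame: a virtual object at 1 m, a real wall at 2 m on the left,
# and a real obstacle at 0.5 m on the right that should hide the object.
scene_rgb = np.zeros((2, 2, 3), dtype=np.uint8)          # black camera image
scene_depth = np.array([[2.0, 0.5],
                        [2.0, 0.5]])
virtual_rgb = np.full((2, 2, 3), 255, dtype=np.uint8)    # white virtual object
virtual_depth = np.full((2, 2), 1.0)

result = composite(scene_rgb, scene_depth, virtual_rgb, virtual_depth)
# Left column shows the virtual object; right column keeps the real scene,
# because the nearby real obstacle occludes it there.
```

The catch the paragraph above describes is that `scene_depth` is exactly what's hard to get: low-resolution, noisy sensor data makes the `visible` mask ragged at object edges, which is why occlusion still looks unconvincing.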
For now, we're playing around with the design opportunities that more believable AR could offer, like grabbing a virtual object and manipulating it, or even turning it into a menu or tool element. It's still early days, but even this quick round of experiments has opened up possible futures that go way beyond a Snapchat filter.