Presented by:

Dan Garcia

from UC Berkeley

Dan Garcia is a Teaching Professor in the EECS department at UC Berkeley. He was selected as an ACM Distinguished Educator in 2012 and ACM Distinguished Speaker in 2019, and is a national leader in the "CSforALL" movement, bringing engaging computer science to students normally underrepresented in the field.

Thanks to four National Science Foundation grants, the "Beauty and Joy of Computing (BJC)" non-majors course he co-developed has been shared with over 800 high school teachers. He is delighted to regularly have more than 50% female enrollment in BJC, with a high mark of 65% in the Spring of 2018, which shattered the campus record for an intro computing course and is among the highest in the nation! He is humbled by the international exposure he and the course have received in the New York Times, PBS, NPR, and other media outlets. He is working on the BJC Middle School curriculum.

No materials for the event yet, sorry!

The Apple Vision Pro (AVP) has been called a “game-changing device”, allowing an augmented reality world, locked in 3D space, to blend seamlessly with your surroundings. Having explored coding in Snap! on the AVP, with the blocks floating above the table and coding as simple as moving and pinching in space, we started thinking about what the experience might be like 10 years from now:

  • Right now, Snap! on the AVP exists only within a Safari browser, but the most immersive and intuitive AVP experiences are stand-alone apps designed specifically for the affordances the AVP provides. The Scratch Foundation did this with “ScratchJr” on the iPad and iPhone.
  • There is an incredible amount of "3D real estate" with the AVP – enough that Snap! could have all blocks visible at once (i.e., not hidden behind tabs) but floating in 3D space – like a Rolodex, perhaps, or arranged side-by-side to the left and wrapping around the user.
  • Selecting the paint editor could launch Apple's built-in Freeform drawing app (or any other app a user wanted, like Photoshop); the same could be done for the sound editor.
  • Everything done with gestures (what the mouse currently does) could also be accomplished with eye gaze and voice – the voice serving as the equivalent of “keyboard input”.
  • AI will probably support us in most of our everyday tasks ten years from now, so it's natural to think that there will be an AI assistant helping us code. It might point out potential bugs before the green flag is clicked, suggest the next block based on what we're doing, or even author small utility blocks for us (if we ask for it and provide a clear spec).
  • The "stage" might extend to be a real "3D stage", with sprites available in 2D (as transparent PNG textures on always-facing-the-user polygons ala Doom) 3D. This might allow us to connect with the Alice user community and their resources. At the very least (even if we stuck with a 2D stage), we would be able to “turn” the stage to see what sprites were in front of and behind each other to debug a complicated scene where things were hidden.
  • We might see some of the features users have dreamed about implemented: collaborative editing à la Google Docs, the ability to select a group of blocks (à la Illustrator) and delete/move/duplicate them, auto-save with a “history” feature that could roll back to any previous version, etc.
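
As an aside on the "always-facing-the-user" idea above: here is a minimal sketch, in Swift using Apple's simd library, of how a flat 2D sprite could be kept turned toward the viewer Doom-style, rotating only about the vertical axis. The function and parameter names are purely illustrative and are not part of Snap!, Doom, or visionOS.

    import simd

    // Minimal sketch of Doom-style billboarding: a flat 2D sprite quad is
    // rotated about the vertical (y) axis so its face always points toward
    // the viewer. Names (spritePosition, cameraPosition) are illustrative only.
    func billboardOrientation(spritePosition: SIMD3<Float>,
                              cameraPosition: SIMD3<Float>) -> simd_quatf {
        // Direction from the sprite to the camera, flattened onto the
        // horizontal plane so the sprite only yaws (it never tilts).
        var toCamera = cameraPosition - spritePosition
        toCamera.y = 0
        // If the camera is directly above or below, keep the current facing.
        guard simd_length(toCamera) > 1e-6 else {
            return simd_quatf(angle: 0, axis: SIMD3<Float>(0, 1, 0))
        }
        toCamera = simd_normalize(toCamera)

        // Rotate the quad's default forward direction (+z) to face the camera.
        let forward = SIMD3<Float>(0, 0, 1)
        return simd_quatf(from: forward, to: toCamera)
    }

A true 3D stage would presumably handle this inside its renderer; the point is only that the math involved is small.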

In summary, it's fun to dream about what life in Snap! will be like years from now, and this lightning talk will paint a picture of one such future.

Duration:
7 min
Room:
Online Room 1
Conference:
Snap!shot 2024
Type:
Lightning Talk