Q&A - Generating ideas in VR with Gravity Sketch
17 January 2017
DEVELOP3D met up with Gravity Sketch to find out more about the future of VR in design, and its plans to take our ideas from a 3D thought to a 3D representation with no steps in between
Describing itself as an ‘intuitive multi-platform 3D creation tool’, Gravity Sketch has made great progress since emerging as a start-up in early 2016 and making one of its first public outings at DEVELOP3D LIVE.
With a public beta now available, we caught up with the team to get their views on VR’s adoption into design tools, and how Gravity Sketch fits into this entire ecosystem for creation.
“We see the world moving towards a fully integrated three-dimensional workflow: The majority of ideas and work we are involved in have a 3D component yet we have trained ourselves over the past few thousand years to represent our 3D ideas in 2D mediums,” opens Gravity Sketch co-founder Oluwaseyi Sosanya.
“This is the first time in history that we can bring our ideas from mind to reality three dimensionally; technology has finally caught up and with the power of the smartphone we can all access our ideas from a 3D thought to a 3D representation with no steps in between.”
Why should designers be paying attention to advancements in VR?
As designers and engineers we have been waiting for more intuitive, expressive tools, especially in the digital design space. There has been a lot of effort put into creating hardware that allows for more of a human touch, but nothing has really taken off or come close to eclipsing the mouse.
What AR and VR allow is the complete removal of the perspective interpretation our brains must do when we work in 3D through a 2D screen. With the HTC Vive and Oculus we now have the hardware to build a completely new design experience on top of. This is huge for the design space.
Users with slim to no CAD training will become proficient in digital 3D design, effectively leapfrogging the entire classical CAD training experience.
This can only happen with the right user experiences and design philosophy behind these new 3D tools. We feel we have entered the age of UX with respect to the 3D design space: what features a piece of software has is starting to become negligible, and the focus is shifting to how well it fits into a user’s workflow and how easy it is to operate.
It feels like the classical CAD software has advanced much faster than the user (at least the more standard user).
Companies like Onshape understand that features are important, but ease of use and user workflow are paramount. When it comes to AR/VR, some of the larger players are going to need to shift focus towards user experience if they are looking at bringing their software to full immersion.
How do you see design embracing VR and AR in the next five years?
We are already seeing a large uptake in VR with companies like General Electric and Jaguar Land Rover using VR as a presentation and review tool. Every company and design studio we work with has a VR setup, and some even multiple setups. There is a genuine interest, need, and desire in this space for specific productivity applications.
The primary purpose we have seen and will continue to see for the next few months is the use of VR as a visualization and communication tool. It is amazing to review an idea at 1:1 scale in an immersive contextual environment. Many of these companies have built a dedicated team of developers with knowledge in Unity or Unreal in order to integrate VR tech with their current workflow.
Over these next few years we are going to see loads of new software that helps companies bypass the technical challenges of bringing their work into the VR space. This is when we will see an uptake by smaller studios and independents.
New input devices are already hitting the market and will start shaping new user experiences. Creation in the virtual space will become quite similar to physical model making, where users will have a series of tools that they can pick from to create the desired features and surfaces. As the hardware gets better, the precision will increase.
We believe every design studio and several independents will have VR and AR setups and each will find a workflow that will fit their practice.
As full mobile VR still has a bit to go before it can be used as a true professional workflow tool, we believe VR will be a fixed-location (studio) type of tool, like a desktop computer. Instead of one headset per person, we believe there will perhaps be two for every five people in the studio. Meetings and early-stage creation will happen in the virtual environment.
Designers and engineers have to think of an idea and bring that idea to reality. Compared with AR, VR will remain the more relevant technology in the design space, as it allows for limitless imagination independent of context or location. AR is great for communicating ideas within a specific context, or for analysing or adding to existing designs.
In AR you always have a context, which is why we think VR will be the primary tool.
Designers and engineers create something out of nothing (their imagination). With an infinite workspace free from gravity or scale, VR has proved to be a great technology for the early design phase. As AR does not totally teleport you to a new environment, it will play a slightly different role in the design workflow.
Over these next few years AR is going to start to reveal its true potential.
Our hypothesis is that the applications will be much more practice-based, such as on a job site or in a working environment, analysing an existing design and being able to have real-time information on that design.
Perhaps even making suggested design refinements. At that point we will be using the object in the environment as a basis for design, engagement and communication. AR will serve as the information overlay, and perhaps some form of input, to enable a clear and immersive communication of an idea.
As an example, if we look at a designer designing a chair, they can design that chair in relation to the desk or environment it will sit in, simply by being in that space and having props around to help gauge the experience. AR will enable a much more context-specific, problem-solving approach to the design phase.
For example, when designing the interior of a boat, one could be inside the boat, put on an AR headset and then design the control panel of the boat in context.
Within the next five years we will also see a dramatic reduction in the price of the hardware associated with AR and VR experiences, and a dramatic rise in the power of the technology: graphics cards, mobile graphics cards, displays and optics. This will open the design space to a very wide group of people who have never had the tools or know-how to create in 3D.
How does Gravity Sketch differ from existing design tools and what benefits does it have?
Unlike other artistic VR creation tools, Gravity Sketch is focused on workflow for creative professionals, and bases the creation of geometry in non-destructive solid modelling.
Gravity Sketch goes beyond strokes in space with powerful tools like surfaces, symmetry options and revolutions (many of the familiar CAD tools). Designers, architects and engineers can create quick 3D mock-ups of ideas and seamlessly bring them into more complex CAD software.
This also makes it a great companion for VR artists and CG designers using tools such as Tilt Brush. By introducing VR and AR technologies that are readily accessible to the world of digital 3D creation, we are setting ourselves apart from the classical CAD packages and at the same time are opening ourselves to a lot of learning and feedback from the community.
As the first tool truly trying to introduce this type of workflow to the CAD community, we are under the microscope and often compared to tools like Solidworks and Fusion 360.
We like to look at ourselves as a link in your workflow, not a competitor to your favorite 3D design tool. EVERYONE (designers and engineers alike) we have worked with to test our tools told us they start with pen and paper to map out their ideas prior to getting into the CAD workflow.
This is because pen and paper allow for quick representation of an idea, and for a level of imperfection and ambiguity that keeps the imagination open, so you can quickly iterate and refine your ideas before entering the rigid, confined CAD workspace.
With pen and paper you must sketch several views of the 3D idea you have in mind, from various perspectives, to better understand what you are trying to bring to life.
We want to compete with your pen and notebook. Why not 3D sketch out your rough idea at 1:1 scale in VR? This results in an infinite number of views and, even better, the ability to take that sketch directly into your favorite CAD tool.
In addition, users can import models made in other tools and manipulate them using the Gravity Sketch UX. This is extremely powerful in design reviews; users are importing complete designs and modifying them in VR.
In addition, we are working on co-creation. ‘Google Drive meets Onshape for VR’ is one of the key experiences we are working on. As professionals in the engineering and design space, we have seen first-hand how time-consuming it can be to share concepts between design teams, engineers and contractors.
With our solution, teams can create, present and collaborate together in the same virtual space at the same time. This will work great with remote teams: designers in London and engineers in the USA can work together in the digital space with no latency.
We will continue to work hard to keep users intuitively creating in 3D, turning the multi-step processes previously needed to create 3D geometry into a one-step, fluid experience.
How compatible is it with other major design tools and processes like 3D printing?
Our goal is to be as seamless as possible with as many other design tools as possible without over-exerting ourselves. At the moment all of Gravity Sketch’s models are saved as .obj. This is a standard mesh file format and can be easily pulled into other CAD software, gaming environments, or taken directly to a 3D printer.
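Part of why .obj travels so easily between tools is that it is a plain-text format: one vertex or face per line. As a minimal illustration of that point (our own sketch, not Gravity Sketch code), here is a tiny Python parser for the vertex and face lines of a Wavefront .obj file:

```python
# Minimal sketch (an illustration, not Gravity Sketch code) of why .obj is
# easy to pull into other tools: it is plain text, one element per line.
# "v x y z" lines are vertices; "f i j k" lines are faces (1-based indices).

def parse_obj(text):
    """Parse vertices and faces from Wavefront .obj text."""
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            vertices.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == "f":
            # Face entries may look like "1", "1/2", or "1/2/3"; the first
            # number is the vertex index (converted here to 0-based).
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:]))
    return vertices, faces

# A single triangle, roughly what one facet of a sketched surface becomes.
sample = """
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
f 1 2 3
"""

verts, faces = parse_obj(sample)
print(len(verts), len(faces))  # prints "3 1"
```

Because the format is this simple, most CAD packages, game engines and 3D-print slicers ship importers for it, which is what makes .obj a workable interchange default.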
On the subject of 3D printing, we have created a special .stl file export that is specifically optimized for 3D printing.
For the professional workflow and true integration with other platforms, we are exporting in the STEP file format, which is a universal format that can be read by just about every CAD package on the market.
We are also working on our own proprietary file format, which will allow users to import their Gravity Sketch files into specific CAD packages, auto-populating the history tree with the tool’s features.
We have seen users taking models from Gravity Sketch into Rhino to use as base geometry for their final design, using the volume and the stroke style to dictate how they bring the design to life.
We also have users that have taken Gravity Sketches directly out of the software to the 3D printer.
Within the VR ecosystem, there are multiple designers in our Beta team who use Gravity Sketch to create base geometry and use Tilt Brush as the finishing rendering tool.
This type of process reminds us of the Adobe Photoshop/Illustrator relationship, where Illustrator creates rigid vector geometry, which can then be used as the frame and structure for the final Photoshop rendering.
How does the VR hardware impact on the ability to sketch?
At the moment the hardware is very much focused on gamers’ needs. This is a bonus for us, because the visualisation requirements for delivering low latency (90fps) are very high.
I can’t imagine we would have hardware of this quality and at this price available for designers’ advantage if this were not a product born out of the gaming revolution.
Even better, the engines that the majority of games are built on (Unity and Unreal) are almost fully open, allowing anyone to build anything from minimal to fully resolved experiences on a variety of budgets, making it a ripe space for collaboration and sharing.
The input devices on the other hand are designed for gaming. The ergonomics are quite specific for this purpose.
This is obviously shaping the experience we are able to deliver and bringing some challenges, especially when it comes to the number of features and tools we would like to offer without the clutter of menus.
The way we look at this is no different from how I imagine the designers at Autodesk looked at the mouse and keyboard way back in the day.
The tools are not specifically designed for us but we work with what we have to make the best, most productive experience we can make.
Which design sectors are most likely to benefit from what virtual sketching can offer and how do you think this will change product design workflows?
Honestly, all sectors of design will benefit from a more intuitive tool. We are creating products with a mouse and keyboard! This is completely dated given our own technology timeline.
Imagine if the craftsmen of the industrial revolution were with us today. Which tool do you think they would be most successful with: VR or the desktop computer?
That being said, we look for opportunities in today’s design and engineering environments. There are now very few powerful industries that still rely on the human hand to determine the outcome of the final design.
Many industries have moved to a completely digital aesthetic, where many of us can identify exactly which 3D CAD tool the designer or engineer used to create a piece.
Luckily for us, there are only three very big industries that still rely on the human hand: architecture, automotive design, and computer graphics (animation).
These industries rely so much on the human hand and aesthetic that prior to going into any sort of CAD tool, there are a series of sketches, prototypes, mockups and concepts presented in a variety of materials such as clay, foam boards, wireframes, and simple cardboard or paper.
This is done in order for them to understand the look and feel, without losing the human essence.
The next step for them is to then bring those prototypes as close as possible to their form in the digital environment.
This is done in a variety of ways. In automotive, they 3D scan their clay models. Architects often take a number of measurements and references from their cardboard or foam board prototypes. Animators take several photos, even short videos, to capture their characters’ essence prior to modelling on the computer.
We don’t think we will replace any modelling aids because there is a tactility and value to them.
We do think software like Gravity Sketch will play a strong role in the process, enabling designers to make clearer judgement calls in the sketch phase, prior to making physical models, so they make fewer, but more relevant, models.
We think this is really incredible, because this seems to be the first time we are seeing a harmonious intersection between craft and technology, and we hope in the future more industries will see the value in this workflow.