{"id":1705,"date":"2016-06-06T12:14:04","date_gmt":"2016-06-06T12:14:04","guid":{"rendered":"http:\/\/projectsimply.com\/?p=1705"},"modified":"2016-06-06T12:14:04","modified_gmt":"2016-06-06T12:14:04","slug":"our-journey-towards-a-better-way-of-prototyping","status":"publish","type":"post","link":"https:\/\/projectsimply.com\/our-journey-towards-a-better-way-of-prototyping\/","title":{"rendered":"Our journey towards a better way of prototyping"},"content":{"rendered":"
<p><strong>Our clients are busy people. When we send them a wireframe or UI design to review, they need to understand what they\u2019re seeing.<\/strong><\/p>\n
<p>Usually we would create static wireframes in our tool of choice, whether Illustrator, Photoshop or, more recently, Sketch. We would then send these wireframes on to the client, expecting them to imagine what would happen if they clicked a button or dragged an image.<\/p>\n
<p>This is fine for most projects, with the client able to grasp what we\u2019ve designed with a brief explanation of the elements within the design and how these would interact. They would then come back with any questions and we would answer these\u200a\u2014\u200aculminating in a design which is both agreed and understood.<\/p>\n
<p>However, some projects need an extra level of interaction. The ability to tap, click, swipe or drag can be key to making a great design more easily understood.<\/p>\n
<blockquote><p><em><strong>The key is to make a design work not just how we intended, but also how the client subconsciously expects it to work.<\/strong><\/em><\/p><\/blockquote>\n
<p>For these projects, we needed a tool that would allow us to add that extra layer of interaction and animation. These tools are known as prototyping tools, of which there are many, all vying to become our next best design friend.<\/p>\n
<p>The aim of this article is not to give a direct comparison of all the tools we evaluated, but to elaborate on our experience of using them\u200a\u2014\u200aand ultimately, to decide what works best for us. You might decide otherwise, which is great, that\u2019s cool\u200a\u2014\u200awe\u2019re not monsters.<\/p>\n
<p>The quality and functionality of these prototyping tools vary wildly, but they all work in at least one of three ways: Page-based, Element-based and Hotspot-based. Allow me to explain\u2026<\/p>\n
<h2>Page-based, Element-based and Hotspot-based<\/h2>\n
<h3><strong>Page-based prototyping<\/strong><\/h3>\n
<p>is the ability to link individual screens together and navigate through them one by one. There is commonly only one interaction assigned to the whole screen, usually in the form of \u2018On screen click, move to screen X\u2019.<\/p>\n
<p>Depending on the level of functionality, this interaction can be combined with an animation (left swipe, fade in) when moving from one screen to another.<\/p>\n
<h3><strong>Element-based prototyping<\/strong><\/h3>\n
<p>is the more powerful and flexible of the two types and allows us to assign interactions and animations to individual elements on a screen\u200a\u2014\u200abuttons, links, images etc.<\/p>\n
<p>Being able to apply interactions to individual elements allows us to create \u2018real-world\u2019 prototypes and to consider the user flow in a more in-depth way, covering all potential routes into and through an app or website, for example.<\/p>\n
<h3><strong>Hotspot-based prototyping<\/strong><\/h3>\n
<p>is a hybrid of the two methods above and allows us to create similar prototypes in a more basic way. We can only apply interactions and animations to the screen as a whole, but using hotspots we can define interactions within areas of the screen.<\/p>\n
<p>What this means is that you are not actually selecting an individual element and applying an interaction, but rather drawing a hotspot over the element\u2019s region and defining what should happen if the user interacts within this defined space.<\/p>\n