Posts by Ryan Carlson

Ryan Carlson

Ryan is a veteran of electronics manufacturing, custom software development, and promotional marketing, and a thought leader in the customer loyalty software industry. He joined The Nerdery in 2012 as a Solutions Engineer and now brings his knowledge and technical expertise to The Nerdery’s marketing department as the resident Technology Evangelist. Ryan is the host of The Nerdery’s weekly podcast, editor of the Nerdery Blog, a regular speaker, and a contributing author for Wired.com and GeekDad.com.

The Evolving Technology of Social Media

This webinar explores the technology behind the tools businesses and community managers are integrating into their software platforms. This is not a discussion about which keywords resonate best with an audience or about optimal word counts. Nerdery developers and social media integration specialists Thomas McMahon and Doug Linsmeyer describe the software options typically leveraged in social media integrations. Our audience told us that anybody could follow this conversation, regardless of technical level. Social media consultants, account managers, and anybody seeking an understanding of the tools and technology going into today’s social media integrations will find this discussion useful.

Slide Deck: View the slide deck on our SlideShare page.

Bonus Q&A Podcast: (running time 9:22)

Our panel of experts follows up on three of the most interesting questions from our live audience that we didn’t have time to address during the webinar:

  • Whether location-aware tools like Foursquare are still worth considering
  • Technology solutions for promoting a mobile business
  • How the different social platforms differentiate themselves, and more

Oculus Rift Experiment – Is Virtual Reality Ready for Business Applications?

Introduction to Oculus Rift

The Oculus Rift is a virtual reality (VR) headset designed to provide a truly immersive experience, allowing you to step inside your favorite video game, movie, and more. It has a wide field of view, a high-resolution display, and ultra-low-latency head tracking unlike any VR headset before it.

Nerdery Lab Program: Oculus Rift

Lab partners Chris Figueroa and Scott Bromander collaborated on this Oculus Rift experiment; their respective Lab Reports are below. The Nerdery Lab program is an opportunity for employees to submit ideas for passion projects demonstrating cutting-edge technologies. Nerds whose ideas show the most potential are given a week to experiment and produce something to show to other Nerds and the world at large.

Lab Report from Nerdery Developer Chris Figueroa:

How is the Oculus Rift Different from Past Virtual Reality Headsets?

The first thing to know is that the Oculus Rift has a very wide field of view. Previously, you would put on a VR headset and have tunnel vision. It didn’t feel like you were in the experience. This was critical because it’s called “Virtual Reality.” How can you feel like you are somewhere else if you just feel like you are watching a tiny screen inside of goggles?

The Oculus Rift puts you in the virtual world. You have a full 110-degree field of view, which has never before been used in virtual reality. When you put on the headset you immediately feel like you are in the virtual world. You can look up and down, and just move your eyes slightly to see objects to the left and right. One of the key things about the Oculus is that you have peripheral vision, just like in real life.

Rapid Prototyping at its finest

The first thing you always do is get a sense of what the 3D world will feel like: put placeholder blocks everywhere, sized like the objects you will later put there. For example, the blocks you see below became rocks. We placed a block there so that when we put the VR headset on, we’d know there would be something there.

[Screenshots: gray placeholder blocks in the prototype scene that later became rocks]
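
This kind of grayboxing is easy to script. Below is a minimal, hypothetical Unity C# sketch of the idea (the positions and sizes are illustrative, not from the actual project): spawn a primitive cube roughly the size of the final asset, so the scene can be felt out in the headset before any real modeling happens.

```csharp
using UnityEngine;

// Hypothetical sketch: stand in a gray cube for an object that will
// exist later, sized roughly like the final asset (here, a rock).
public class PlaceholderBlock : MonoBehaviour
{
    void Start()
    {
        GameObject block = GameObject.CreatePrimitive(PrimitiveType.Cube);
        block.transform.position = new Vector3(2f, 0.5f, 4f);   // where the rock will go
        block.transform.localScale = new Vector3(1f, 1f, 1.5f); // rough rock proportions
    }
}
```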

Development Challenges

Developing for the Oculus Rift is a complete departure from developing video games, 3D movies, 3D graphics, or any other media that involves 3D. You’ll quickly realize that things you create can make people sick with the Oculus Rift. Sometimes you won’t know what is making you sick – you just know something “feels wrong.” It’s a dilemma to have a very cool product that makes users sick because something on the screen moves wrong, or the UI is in their view, or textures look wrong in the 3D world – it can be any number of things. Below is what we encountered.

1. Don’t Be Tempted to Control Head Movement

In real life you choose what to look at. Advertisers have experience using lines and colors to guide someone’s eye to an object on a billboard, but with virtual reality you have to do that in 3D space. It adds a whole new element of complexity that very few people have experience with.

The easiest thing to do is just move the 3D camera so it points at something. What you don’t think about is that no one in real life has their head forced to look at something, so if you do it in virtual reality it can literally make people sick! It’s just ill-advised to make users ill.
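
As a hedged illustration of the difference (this code is not from the actual project, and the class and field names are hypothetical), here is a Unity C# sketch contrasting the camera-hijacking approach with an in-world cue that lets users turn their own heads:

```csharp
using UnityEngine;

// Hypothetical sketch: guide attention without forcing head movement.
public class AttentionCue : MonoBehaviour
{
    public Transform cameraRig;       // parent of the head-tracked camera
    public Transform pointOfInterest; // what we want the user to notice
    public Light cueLight;            // a light placed near the target

    void Update()
    {
        // DON'T: overriding the view direction fights head tracking
        // and can make people sick.
        // cameraRig.LookAt(pointOfInterest);

        // DO: pulse an in-world light near the target so the user's eye
        // is drawn there, and let them choose to look.
        cueLight.intensity = 1.5f + 0.5f * Mathf.Sin(Time.time * 4f);
    }
}
```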

2. User Interface vs. World Space

The Oculus Rift wants you to feel like you’re experiencing real life. So how do you display information to users wearing the VR headset? The first thing people say is, “Let’s just put information in the top-right corner to indicate something important needed to get through the experience.” This sounds completely normal and works for everything except virtual reality – putting something in the view of your face will not only obstruct the user’s view, it could also make them sick!

Rule of thumb that I learned from the Oculus Rift Founder:

“If it exists in space, it doesn’t go on your face.”
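
A minimal Unity C# sketch of that rule of thumb (the names are hypothetical; TextMesh is the stock Unity 3D-text component): instead of pinning a label to the camera, anchor it to the object it describes and let it face the viewer from world space.

```csharp
using UnityEngine;

// Hypothetical sketch: information lives in the world, not on the lens.
public class WorldSpaceLabel : MonoBehaviour
{
    public Transform anchor; // the object the label belongs to
    public TextMesh label;   // 3D text placed in the scene

    void Start()
    {
        // Park the label just above the object it describes...
        label.transform.position = anchor.position + Vector3.up * 0.5f;
    }

    void Update()
    {
        // ...and rotate it to face the viewer, without ever following
        // or blocking the view itself.
        label.transform.rotation = Quaternion.LookRotation(
            label.transform.position - Camera.main.transform.position);
    }
}
```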

3. Development Kit Resolution

The first development kit for the Oculus Rift has very low resolution in each eye. When people first put the headset on, they immediately say it’s low resolution. They’re right, and it was very interesting to work with because 3D objects and their edges, colors, and lines don’t look the same as they do on your computer screen. Sometimes fonts are completely unreadable.

Everything must be tested before a user tries the experience or they may miss out on whatever the 3D world is attempting to show them.

4. High-Resolution Textures vs. Low-Resolution Textures

Most people who work with 3D content or movies without restrictions know that higher resolution is better. The low resolution of the Oculus Rift made for some weird problems, because high-resolution textures actually looked worse than low-resolution ones. Even though people can look at a 3D rock and tell its texture is low resolution, that didn’t matter – the high-resolution textures didn’t look anything like what you wanted them to be.

Programs I used for the Oculus Rift Project:

  • Unity3D – Game Engine used to interact with 3D environments
  • Oculus Rift Dev Kit 1
  • C# and C++ (Oculus SDK)
  • MonoDevelop – I write C# on a Mac with Unity3D
  • Blender 3D 2.69, with a Python transform plugin I made
  • Photoshop CS6

Lab Report from Nerdery Developer Scott Bromander:

Building 3D Models for the Oculus Rift

The process for this lab experiment was broken into two clear paths of work. The 3D modeling and the SDK (software development kit) engine work could happen simultaneously, since we had to have 3D visual assets to actually put into the environment – much like drafting a website in Photoshop before slicing it up and styling it with HTML and CSS. The Oculus SDK work focused on the environment and user interactions, while I took the placeholder objects in the environment and swapped in the realistic assets.

For my specific portion of this experiment, I handled the modeling of objects within the 3D experience. Since our goal was to create an example of a business application for a 3D simulator, I built a full-scale model of a residential house. Our experiment demonstrates how Oculus Rift could be used in visualizing a remodeling project, vacation planning, or property sales.

Building these real-world objects is a lot like sculpting with a block of clay. You start with nothing and use basic geometry to shape the object you would like to create. In this case, it was a house that started out looking very plain and very gray.

Typically, the real magic of 3D modeling doesn’t come together until later in the process, when you change the flat gray 3D object and give it a “skin,” called a texture. Texturing requires that you take that 3D model and break it down into a 2D image. Creating 3D objects follows a specific process to get the best results.

My Process

  • Plan and prep
  • Build a pseudo-schematic for what would be built
  • Create a to-scale model
  • Texture and refactor geometry

Tools

I used 3DS Max to build out the front of the house, along with measurement guides that I pre-created from basic geometry – in this case, a series of pre-measured planes for common measurements. I was then able to use those guides throughout the modeling process to speed things up.

Additionally, I used a lot of the data-entry features of 3DS Max to get exact measurements applied to certain components of the house. This ensured that the scale would be 100% accurate. Once it was modeled in 3DS Max to scale, we then came up with a conversion ratio to apply before bringing the model into Unity.
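
As a rough sketch of that hand-off (the numbers here are illustrative assumptions, not the team’s actual ratio): 3DS Max scenes are often authored in inches or centimeters, while Unity treats one unit as one meter, so a uniform conversion factor can be applied to the imported model.

```csharp
using UnityEngine;

// Hypothetical sketch: apply a fixed unit-conversion ratio to a model
// authored to scale in 3DS Max before using it in a Unity scene.
public class ImportScale : MonoBehaviour
{
    // Example only: source file authored in inches, Unity expects meters.
    const float InchesToMeters = 0.0254f;

    void Start()
    {
        transform.localScale = Vector3.one * InchesToMeters;
    }
}
```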

Finally, we optimized texture maps by including extra geometry for repeating textures (like in the siding and roof). The trick here was to plan for it while at the same time ensuring the scale was accurate. In this case, guides help a lot in slicing extra geometry.

Photoshop for texture generation

To create textures for the house, we used photos I snapped on the first day. One problem here: I didn’t set up the shots for texture use (lens settings), so a significant amount of cleanup work needed to be performed. If you think about how we see things and how a lens captures images, it’s not in a flat space but rather a little more spherical. So, using a combination of stretching, clone-stamp, and healing-brush techniques I’ve learned over the years, I was able to take this semi-spherized image and make it appear flattened out.

After those textures were created, we took a pass at creating bump and specular maps. While the final product of that work ultimately never made it into the final experiment, I did follow the process. In both cases, I used an industry-standard tool called Crazy Bump. The purpose of these types of “maps” is to create the look of additional geometry without actually adding it. Basically, these maps tell Unity how the light should respond when hitting the 3D object to give the effect of actual touchable texture. So if you get up close to the siding, for example, it has the ridges and look of real siding.
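
For context, here is a hedged Unity sketch of what wiring up such maps can look like (the shader and property names are stock Unity built-ins of that era; the texture fields are hypothetical):

```csharp
using UnityEngine;

// Hypothetical sketch: give a material a normal (bump) map so light
// responds as if the surface had real ridges, without extra geometry.
public class SidingMaterialSetup : MonoBehaviour
{
    public Texture2D diffuseMap; // the flat color texture
    public Texture2D normalMap;  // e.g., generated with Crazy Bump

    void Start()
    {
        Material mat = GetComponent<Renderer>().material;
        mat.shader = Shader.Find("Bumped Specular"); // built-in shader
        mat.SetTexture("_MainTex", diffuseMap);
        mat.SetTexture("_BumpMap", normalMap);
        mat.SetFloat("_Shininess", 0.3f); // tightness of the highlight
    }
}
```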

Had we more time, we’d have used Mental Ray texturing/lighting to give a more realistic look, and then baked that into the texture itself. This effectively would have taken all of these different maps, textures, and lighting setups and condensed them down into one texture. Next time.

Challenging Aspects

One of the challenging aspects of this project was deciding, based on “what is important,” which details from the early designs should get actual geometry versus just a texture. My initial thought was that if I was able to get close to these objects with the Oculus Rift on, I’d be able to catch a lot of the smaller details – planning for that and getting a little deeper into the geometry was on my radar from the get-go. Ultimately, though, with the prototype version of the Oculus Rift having a lower resolution than is planned for the final product, a lot of those details were lost.

Objects like the window frames, roof edging, and other small details were part of the early process. You save a lot of time when you do this planning up front, because making particular changes after the fact is more time-consuming. Even when going back to add a detail is quick, knowing its placement and measurements ahead of time really smooths the process.

New things that I learned

Important lesson: how to plan for the Oculus Rift, since it doesn’t fit the usual project specifications. Having a higher polygon count to work with was freeing after several years of building for mobile and applying all of the efficiencies I’ve learned creating performant mobile experiences. But I learned this maybe a little too late in the process, and it would have been great to include that extra geometry in my initial polygon budgets. Ultimately, the savings helped us when it came time to texture. All of this is the delicate balance of any 3D modeler, but it was interesting being on the other end of it, coming out of modeling 3D for mobile devices.

Things I’d have done differently in hindsight

I would have shifted my focus and time from the small details that didn’t translate as well, given the lower resolution of the prototype Oculus Rift that we were working with. I could have spent that time creating bolder visuals and texture maps.

Given more time – or for a next iteration or enhancement to make a better, more immersive experience – I’d also create more visually interesting texture maps, build out the interior of the house, and add more tweening-style animation, including more visually interesting interactions within the environment.

I’d like to have spent less time on the details in the 3D-modeling portion and a lot more time getting the textures to a place where they were vibrant and visually interesting within the setting we ended up with. In any rapid 3D-model development, remember that it starts as a flat gray model; if you don’t plan the time to make the texture interesting, it will look like a flat gray 3D model. So having more time to go after the textures using some sort of baked Mental Ray setup would have been awesome.

Which really brings me to what I would love to do in another iteration of the project: take the time to make textures that look extremely realistic, but do so in a way that plays to the strengths of the Oculus Rift and the Unity engine – which is all a delicate balance of texture maps between 3DS Max and Unity, in conjunction with how everything renders in the Oculus Rift display. I think that would drive the “want” to interact with the environment. Then, beyond the model, include more animation and feedback loops in the interaction as a whole.

I’d also include an animated user interface – developed in Flash and imported using a technology like UniSWF or Scaleform – that would be used to make decisions about the house’s finishings. Then, as the user makes those decisions with the interface, I’d include rewarding feedback in the environment itself – bushes sprouting out of the ground in a bubbly manner, or windows bouncing in place as you change their style. That sort of thing – the type of interaction feedback we are used to seeing in game-like experiences, used instead to heighten a product-configurator experience.

Again, next time – so that’s our Lab Report for this time.

Maintaining Your Software For The Long Haul

In our webinar on software maintenance, we discuss the importance of having a plan. Part of that plan is identifying and addressing the root causes of an observation, not just treating the symptoms. So, taking our own advice, we address the typical root causes of needing software maintenance. We specifically call out how we all make decisions in software development that affect the short-term and long-term lifespan of a project – this is how we define technical debt. Tune in to the webinar for a great conversation that gives you the context to better understand long-term support of a project.

View the slides from this webinar on our SlideShare page

Listen to the Q&A audio recording below:

(Running time: 13:15)


Your Users are Yip Yips

Using UX tools to serve all your users. Yip, even the plush aliens.

By Emily Schmittler and Christopher Stephan, Senior UX Designers

As UX designers, we always want people to understand, benefit from, and even enjoy the interfaces and experiences we’ve shaped for them. The clients and companies we work with feel the same way; where we often differ is in how we do the work to get there. To many UX professionals, the appropriate process involves engaging with and talking to members of the target user population. Many companies, however, assume the UX professionals they hire have built-in knowledge about their audience and don’t think spending time with users is necessary.

This thought – that we in UX have users “all worked out” – is a frequent cause for concern. Sure, we can perform expert reviews and analysis, we have principles like Fitts’s Law and “7, plus or minus 2,” and an often-involuntary gut reaction to interfaces that are not as clear as possible. However, no set of models, rules, guidelines, or physical reactions can ever effectively approximate a single human being, let alone a group of them. Humans are unpredictable, frustrating, willful, and sometimes hilarious. In short, your users are Yip Yips.

Photo of two Yip Yips

These are Yip Yips. Yip yip yip yip uh-huh uh huh.


How are Yip Yips like users?

You’re likely wondering what comparison we can make between your customers, subscribers, visitors, and contributors and these bug-eyed, plush alien rags with a lot of mouth that sound like ⅓ of every corporate teleconference you’ve ever witnessed. For that, you’ll have to journey with us beyond the puppets to the puppets’ behavior. Let’s take a closer look at Yip Yips.

They’re not from around here

Yip Yips are aliens from a different planet. They are not like you, and they do not respond how you might expect. Users don’t come from planet YourOrganization either. Just because you think your instructions are straightforward and your meanings plain doesn’t mean they will mean anything to a creature from outside your intellectual ecosystem.

Their background and understanding are not like yours

Yip Yips don’t have your frame of reference, your cultural history, your idioms, jargon or catch-phrases. This means they cannot be expected to perceive your world in the same way as you, talk about it using the same words, or approach things in the same way.

They are smart and learn from their experiences

The Yip Yips are examining our world and forming reasonable, intelligent opinions based on their experiences. This means they are not passively consuming what they see and hear; they’re finding ways to understand and engage. Users, too, are going to – consciously or not – try to understand your offering and message. They will make connections and meaning that are sensible and relevant to themselves.

They have goals and came for a reason

While not actively trying to conquer Earth – so far as we know – the Yip Yips do seem to be here to learn. They are driven by curiosity and the need to understand. Your users are going to have goals, too. As with the Yip Yips, we can’t be certain what caused them to visit or engage.

5 Things Yip Yips Teach Us

(In their own special way)

Some of what you are about to see, you may have seen before – not on PBS, but in the behavior of your users. While most of your users are likely not covered in colorful plush fur – and have fewer antennae – they still do things that can seem strange, random, and unpredictable.

Yip Yips Discover Earth

While the Yip Yips didn’t find Earth – their intended target – right away, they did learn a few new things during their exploration.

Lesson 1: A user’s experience is about the journey as well as the destination

In the process of looking for Earth, each new discovery was exciting and educational, making their experience an overall positive one. Users may not always know exactly what they are looking for, so it is important to identify what they find valuable, useful and entertaining to make sure that their journey is always positive. While you might have your main navigation nailed to perfection, users may need more to engage them and keep them returning to your site or application.

Your Project: The user who doesn’t know what they are looking for is just as important to the success of your design as the one who does. User research and an understanding of their behaviors can help determine how to support exploration and playfulness in these less-directed individuals.

Lesson 2: It’s good to be challenged, in some cases

Not everything has to be easy. The fact that the Yip Yips enjoy their journey to finding Earth, even though it isn’t immediately successful, demonstrates the joy individuals often find in being challenged. Commonly, clients misconstrue UX to be about making things “easy to use.” Au contraire! A challenge can make an experience more enjoyable than something perceived as easy. Consider the design of a game: are easy games fun? Not really. UX therefore plays a very important role in determining the level of challenge that proves engaging. In these cases, it is essential to consult users via testing to find a level that isn’t so difficult it becomes frustrating – or so easy it gets quickly discarded as boring.

Your Project: It’s okay to make things harder for a good reason. Just not too hard. Usability testing is a great way to test concepts that need the appropriate amount of difficulty.

Yip Yips Discover a Clock

Lesson 3: Even a detail can scare people away

Another common misconception about user research is that it’s nothing more than asking users what they want. In this example, the Yip Yips are presented with a clock. The clock has all the attributes they were expecting; however, it still scares them away, resulting in an overall negative experience. While user research may seem aimed at discovering what users want, it more importantly uncovers why they want it – allowing the designer to make educated decisions about whether a requested feature is appropriate – and what would make the experience useful. We therefore employ research methods that get at the core of what users are experiencing, so we can address the problem instead of offering temporary relief of the symptoms. Further, we test those ideas with those users to ensure the intended outcome is met.

Your Project: Don’t focus discussions with users on what they want; instead, talk about the problems they are facing to understand their viewpoint. Use the problem space to inform design decisions, then test for success.

Yip Yips Discover a Radio

Lesson 4: Users might not want what you’d expect

In this case, you could make the common-sense assumption that people have preferences for certain genres of music, but overall the experience of music is pleasant. The Yip Yips prove that a general understanding – or a common sense approach – is not enough to design an experience that is inherently pleasant. What if – like the Yip Yips – your users have totally different preferences than you would expect [and would rather listen to static]? This is a more common occurrence than you might expect.

Your Project: It is extremely important – especially with new product concepts – to research and test ideas with real users before making a large investment to develop an idea. By involving users, you can determine worthwhile features more quickly. There are many ways to involve users in the design process that go beyond asking what they want. Consider collaborative design sessions with your users, or methods that look at the problem space more broadly, like observation.

Yip Yips Discover a Fan

Lesson 5: Engagement doesn’t mean positive experience

Ever look at your site analytics and think, “Wow! People are spending a lot of time on our site. That’s great!”? Unfortunately, you may be misreading your data, and there is no way to prove it without diving deeper. In this video, the Yip Yips demonstrate how a user can be (or appear to be) engaged, yet have a totally negative experience – they are blown across the room… twice!

Consider that sometimes users are obligated to use a system. Signing up for insurance, using a website for work, and doing taxes are all examples where a user has a task they must complete; they are therefore likely to spend time painstakingly interacting with a system they might otherwise have ditched when the unpleasantness began. In these cases, the analytics may look as if the user is engaged, but that’s not necessarily the whole story.

Your Project: Even if you have robust analytics data suggesting positive or negative experiences, it is really important to dive deeper using qualitative research methods to ensure a successful redesign.

An Ecosystem Full of Yip Yips

So if users are like Yip Yips, what does that mean for how we think about them? It means we should be watching them a little more closely and trying to understand their motives and support their interests. Luckily, UX designers come primed with tools and methods to help you build that understanding and define where the overlap between business goals and user needs lies. And you don’t necessarily need to spend a fortune to do it. We can help you get to know your Yip Yips in a variety of ways within your budget.

Next time…

Now that we’ve told you why you need to think of your users as Yip Yips, stay tuned for some ideas about how to do research with your users and how to test the designs that research has informed. Watch blog.nerdery.com for part two of the Yip Yip adventure!

Filed under Design, The UX Files

Webinar: Mastering Interactive Development Lingo

Due to unfamiliar and sometimes confusing language, interactive projects can end up costing more or taking longer to complete than they were supposed to. One of the primary contributors to project delay is what we like to call the “Miscommunication Tax.” In this webinar we took the opportunity to talk through some of the lingo that consistently lends itself to miscommunication during a development project. If you sell interactive projects, work on interactive projects, or would like to hear what a user experience designer, a developer, and a quality assurance engineer have to say about confusing terms and how to create clarity around them, you’ll want to tune in.

Bonus Content:

Filed under Nerdery Webinar

NerdCast #91: Webinar Q&A – Mastering Interactive Development Lingo

In response to our recent webinar about development lingo, we talk about how to avoid paying the high price of what we call the “Miscommunication Tax.” Curious to learn more about what goes into a Scope of Work? Where most of the confusion occurs within a project plan? What about Quality Assurance meets the most resistance? These are the kinds of questions we answer as we dig into interactive development lingo.

Host: Ryan Carlson

Guests: Josh “Paro-like-arrow”, Sherman Bausch, and Justin Holman

Listen Now: Running Time: 0:21:25 / Subscribe on iTunes

Filed under NerdCast

NerdCast #90: Building Upon Great Solutions With Java

In this episode of the NerdCast, Brian Rowe interviews Nerdery Java developer Sarah Olson and asks the all-important question, “What would you say ya do here?” They talk Java, new frameworks, the pending release of Java 8, and much more.

Host: Brian Rowe (UX Manager)

Guests: Sarah Olson (Software Developer)

Listen Now: Running Time: 0:21:25 / Subscribe on iTunes

Filed under NerdCast

NerdCast #86: Making the Case for UX and QA Partnerships

In this episode of the NerdCast we discuss the crazy notion of bringing Quality Assurance engineers into the usability testing process. It seems less crazy the more Christopher Stephan and QA engineer Alex Hofer talk about bridging the gap between these two project activities, both concerned with quality control.

Host: Ryan Carlson

Guests: Christopher Stephan (UX) and Alex Hofer (QA)

Listen Now: Running Time: 0:22:11 / Subscribe on iTunes

Filed under NerdCast

NerdCast #85: Targeted Cyber Crime – Discussing BlackPOS

On this episode of the NerdCast we interview security experts Chris Wade and Jason Herbst from the Nerdery QA team. We look at the malware used to target high-profile retail companies in a massive case of stolen data. BlackPOS is a brilliant piece of software – in another context, its design would be called genius. Hear how the malware works, what it can reportedly do according to security research firms, and what Jason and Chris think of our current state of security.

Host: Ryan Carlson (Tech Evangelist)

Guests: Chris Wade and Jason Herbst (QA Department)

Listen Now: Running Time: 0:23:13 / Subscribe on iTunes


NerdCast #84: Webinar Q&A – Testing Your User Experience for Usability

We rejoin our webinar panelists to discuss the top questions about usability testing that we didn’t have time to cover during the live event. We cover quality versus quantity, discuss return on investment, and get into some great nuts-and-bolts conversation around methodologies and example types of tests.

Catch the webinar on usability testing (running time 1 hour)

Host: Ryan Carlson (Tech Evangelist)

Guests: Christopher Stephan (UX), Dave Jones (UX), and David Rosen (University of MN Usability Lab)

Listen Now: Running Time: 0:27:13 / Subscribe on iTunes

Filed under NerdCast