
    Steve Wozniak Opens Up at the Mimecast Cyber Resilience Summit
     

    The Apple co-founder shares his views on innovation, AI, cybersecurity, business ethics and giving back to society in a fascinating Q&A with Mimecast CEO Peter Bauer.   
     

    by Mike Faden

    Key Points

    • Silicon Valley icon discusses how to instill an ethical approach into organizations—and how to create disruptive innovation.
    • Wozniak agrees that cybersecurity risk is one of the most important threats the world faces today. He says AI can help, despite its limitations.
    • Quantum computing may be very useful for certain functions—but Wozniak believes that its technical requirements will limit broader applicability.

     

    For the closing keynote of Mimecast’s 2020 Cyber Resilience Summit, Apple co-founder and Silicon Valley legend Steve Wozniak joined Mimecast co-founder and CEO Peter Bauer for a fascinating Q&A discussion. In addition to his reputation as a genius who invented the first personal computer, “Woz” is also renowned for his strong ethics and his efforts to give back to society, including teaching thousands of children how to use technology. The Q&A covered a wide range of topics, from instilling one’s ethical values into an organization and creating disruptive innovation to quantum computing and the role of AI in cybersecurity and in society. Here are extracts from that conversation, edited for brevity and clarity. You can watch the interview in its entirety, along with dozens of other informative sessions, on demand at the Mimecast Cyber Resilience Summit website.

    Bauer: Your career has been defined by a strong ethical compass, openness, honesty, concern for others. How have you been able to live true to your values in the face of immense success and the rapidly changing world?

    Wozniak: My father was very strong on ethics, so that’s probably where it came from. I remember in high school, I was a geek. I was an outsider. I didn't have friends to influence what my values would be, and I would go home and think them out. How do you decide things based on truth, facts, and not just [whether] you're a good talker and know how to please? I thought about what’s right and what's wrong and I always came up with: If you're truthful, it will lead to the correct actions. I made a promise to myself, and it's a promise that I've kept for life.

    Bauer: Organizations like Apple become very powerful and their values can impact a great deal of people. What do you feel is the relationship between the values of the founders and the people they choose to work with, and what those organizations become in the world?

    Wozniak: I believe in leading by example—in other words, what I care about, my design values and my personal values, I think carried into Apple’s values and Steve Jobs’ vision. As long as those [leaders] are around, those values are strong, and people admire you for them. When they admire you for them, it influences their own behavior and their own values.

    Bauer: One mantra that I've had in my relatively short entrepreneurial career is, "Don't found, co-found." You and Steve Jobs co-founded something that is an incredible legacy in the world. What was the magic that came from the combination of your abilities? How has that made you think about teams that you've subsequently been part of?

    Wozniak: It's hard to go back and judge which aspects made you win and which made you lose. Steve Jobs and I were very good technical friends and had incredible engineering skills. Steve had this excitement—he wanted somehow to get to a position in life to be one of those important people. But he didn't have the academic background and the employment skills to get a start in a big company. He had me. Each of us on our own probably wouldn't have made it. My computer would have gotten nowhere without a business. Friendship means a lot to me, and loyalty, in that sense.

    Bauer: That's such an important part of a partnership because building a business is tough—you go through a lot together, and resilience is needed in the relationship to transcend that. How do you think that translates to people in our audience today in terms of how they might think about teams and putting skills together into groups that need to innovate?

    Wozniak: Understand that what you're doing in a company or a team is not the end-all solution that makes it successful. There are a lot of other people in other departments, and each of those is important. Try to understand them, try to communicate, talk with people doing other things. If your personalities mesh, it works well. If your personalities don't mesh, it doesn't work well.

    Bauer: You have a real passion for teaching school kids, and I think many parents have spent much of the last few months doing double duty as home-schoolers. Why were you drawn to teaching and what advice would you have for parents who want to teach their kids to think differently?

    Wozniak: When I was young, my father spoke about how your teachers are going to give you skills that will enable you to go through college and get a job, money, a house, a family. I got this very strong value towards teachers and education. I told my father: “When I grow up, I'm going to be an electrical engineer like you—but second, I want to be a fifth-grade teacher.” Eventually, of course, I had children, they were in school and I decided, well, I should test myself as a teacher.

    I took on teaching [children] 200 hours per year, even before laptops, how to use a computer for all the assignments in school. It was voluntary, and I did that for eight years. One thing I learned, as a parent, is that patience is probably the most important thing—being able to explain something politely 10 times in a row. Don't expect that somebody understands it just like you do.

    Bauer: Today, we're all dealing with a world disrupted every which way. For you, it seems that disruption's actually in your blood, and it can be a good thing: Apple has famously disrupted things again and again. How should an ordinary business think about being a disruptor—and why is that even important?

    Wozniak: [All businesses need to] keep the money generation going so you can live. But they also [need] inventors. How do you have both? How do you know whether an engineer has ideas that are so inventive that, 5 or 10 years from now, they might be a huge market? It takes a lot of observation, as well as guessing. If a company is large enough, you hear of skunkworks projects. Sometimes a CEO like yourself gets an idea that doesn't fit the whole company’s plans. He grabs one of the young, freethinking engineers—one of the makers maybe—and says, "Why don't you go try this on your own time? Do it at home, I'll even fund you." That was done at Apple, in some really important cases.

    One thing I sometimes propose is to have a chief disruption officer that does not report to the CEO, but reports to the board of directors. The CEO's business should be keeping the company running and successful. The disruption team should look at what might come along in the world, studying technologies, projects, materials that could affect us and making sure we're ready for them. Also, can we think of any ideas ourselves that could disrupt this industry, so that we could be the disruptor?

    Bauer: Quantum computing, any thoughts?

    Wozniak: Of course, we all hear about quantum computing, especially if we're technology and science-oriented. I’ve been a disdainer on it to this day, because I hear of the incredible things that a quantum computer could calculate—that no regular computer could ever work that fast—but then I look at the technology it takes to do it. Just for one qubit, you've got to have [the temperature] at like 0.1 degrees Kelvin [about minus 459 Fahrenheit, slightly above absolute zero]. So it's not going to be a personal product. It's going to have limited, narrow applications.

    Bauer: You spoke a little bit earlier about the human brain, which made me wonder about artificial intelligence. I'm sure we'll all have to keep working for a while, but what are the chances that our kids won’t have to do too much, thanks to artificial intelligence?

    Wozniak: Artificial intelligence does things better than humans in some cases, but it doesn't work like a human mind—because we don't know how the human mind works. Sure, you can train [AI] to play chess better than any human. You can show Google 80,000 pictures of a dog and it will recognize a dog faster than any human can. But a two-year-old child, you say that's a dog, and they look at it and know that it's got limbs and has an eye and a mind, and that it decides where it's going to go. The child knows what a dog is. Google only knows pixels; it can't even tell if the dog is a picture on a wall or a real dog.

    So there are a lot of shortcomings. Artificial intelligence obviously is going to help us so much where we can apply it, but we can only apply it in specific areas. Artificial intelligence can become smarter and smarter and more assistive to humans, but technology has always been that way. Artificial intelligence is just a keyword for the state of the art in technology.

    I started getting the feeling that someday we might be taken care of by the computers. We won't have to worry about [the necessities of] our life. The computers will just take care of all of our needs, including food and clothing. I thought that's like a family dog. At that point I started feeding my dog filet steaks. Do unto others—if I'm going to be a family pet someday, my dog's going to be treated today the way I want to be treated. AI should be embraced because it does help us do things we couldn't do and, in some cases, it can help make us safer, make us healthier, but we should realize it has limitations.

    Bauer: On that safety angle, what do you think the intersection might be with criminals using artificial intelligence, perhaps for cybercrime? And on the other side, how artificial intelligence could be used in cybersecurity?

    Wozniak: Well, look at our fears today—fears of war, crime, being attacked. But I think the biggest, most realistic fear is in the area of cybersecurity. Everything has converted to digital, and it's all open. I think that's the biggest fear that needs the most attention. Artificial intelligence is just a way of doing something better than we've done before, which is studying millions of patterns looking for little clues that might give you a hint as to when something is going wrong. Maybe your system is being attacked and you don't know it, or the [security] camera's turned on in your house and you don't know it. Artificial intelligence and machine learning should help a lot in that.

    Bauer: Interesting. It's certainly what we're seeing as we're continuing to innovate in cybersecurity ourselves and we are aware of what adversaries are doing. I’m getting slightly more philosophical here, but what do you think is the biggest threat to mankind?

    Wozniak: Probably humans themselves. Cyberattacks are the best example that might bring mankind down just accidentally. What if everything digital stopped working? What if you had no computer or phone, what if it all got cut off because maybe some really evil parties figured out how to do it?

     
