You are listening to the HumAIn Podcast. HumAIn is your first look at the startups and industry titans that are leading and disrupting artificial intelligence, data science, future of work and developer education. I am your host, David Yakobovitch, and you are listening to HumAIn. If you like this episode, remember to subscribe and leave a review. Now onto the show.

David Yakobovitch 

Welcome back, listeners, to the HumAIn Podcast. Today I have a special guest speaker; his name is Daniel Whitenack, and we got to know each other over the last few months. In fact, last year I was featured on his podcast. He’s the co-host of the Practical AI podcast, as well as a data scientist at SIL International. Daniel and I hashed it out; we talked about AI being HumAIn and a lot of really interesting topics. Daniel, thanks so much for being with us.

Daniel Whitenack 

It’s great to be here and great to follow up and switch sides as it were and be over here on the side, on your podcast. Great to be here. 

David Yakobovitch

Thank you. I love it. That was a few months ago; we were talking in 2019 about being HumAIn, about the technology we’re seeing in the evolution of data science tools, about some of the work both you and I do in the education arena, and about our podcasts. What’s amazing is, fast-forward from then, August 2019, to now, still Q1/Q2 2020, and it’s a whole different world. We are very quickly becoming a society that is no longer just in person but online, and everyone is working remote.

Daniel Whitenack 

Zoom is saving the world

David Yakobovitch 

Zoom is saving the world, unless it crashes. I know Roku crashed a few days ago. How has working remote been for you? How is Zoom being a saving grace?

Daniel Whitenack 

Being that I’m on a distributed team, in some ways it’s actually pretty normal for me. I see people talking online about being bored at home, or not being able to figure out the whole working-from-home situation, but it’s pretty normal for myself and my team. I am fairly often on calls with people all across the U.S., but also in Singapore, India, and Africa, mostly via Zoom.

David Yakobovitch 

And have you seen that as well? Not just in the United States, which has been hunkering down into shelter-in-place for everything COVID-19, but in the other countries you mentioned, like India, and continents like Africa, have they been taking similar measures, from what you’ve heard?

Daniel Whitenack 

I have calls every week with a team in India that I collaborate with, and just this week, similarly to where I’m at, they went fully remote from their office, because they’re all programmers and software engineers, so they’re all working from home as of now. I don’t know when they’ll come back from that, but it seems to be a similar idea.

David Yakobovitch 

One of the biggest blessings of working from home is that we get to do a lot more research, especially research in data science. Similar to how we were talking about the evolution of data science and tools back in August 2019, there have been a lot of new tools in the market and a lot of new evolution, especially around language and NLP, which I know is one of your specialties. So here we are now, fast-forward a few months past 2019: what are you seeing, Daniel, as the state of the art in modern natural language processing?

Daniel Whitenack 

It is really interesting. If we go back even a year ago, things seem vastly different. What’s really boosted NLP in the last couple of years are these large-scale language models. Oftentimes in an AI model that’s processing text, you’ll have one or a series of encoders that encodes the input into some internal representation, and then a set of decoders that decodes that into the specific output you’re interested in. So in translation, your encoders would encode into that representation and your decoders would decode into a different language.

For text classification, you would again encode, but then you would decode into one of a series of categories for that text. What’s really been interesting are these large-scale language models that have been trained, like GPT-2 and BERT and ELMo, and a bunch of others that I’m sure people have heard of. They’re trained on a massive set of data, sometimes even for multiple languages, such that you really can apply that model to a wide range of tasks by just fine-tuning it for one of those tasks, like translation or sentiment analysis or text classification, with a much smaller amount of data than was required before.

And that led to this explosion in the application of AI and NLP, because it used to be that in NLP you thought about each problem as very different from the others: dependency parsing, text classification, and entity analysis were all different models. Now it’s all generalized in the same sort of framework.
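The shared-encoder idea Daniel describes can be sketched in a few lines of plain Python. This is a toy illustration only, not a real neural network: a crude bag-of-words stands in for learned embeddings, and the two "decoder" heads and word lists are made up for the example.

```python
def encode(text):
    """Shared 'encoder': turn text into a crude internal representation
    (a bag-of-words frequency dict standing in for learned embeddings)."""
    rep = {}
    for token in text.lower().split():
        rep[token] = rep.get(token, 0) + 1
    return rep

def decode_sentiment(rep):
    """Task head 1: classify sentiment from the shared representation."""
    positive = {"great", "good", "love"}
    negative = {"bad", "terrible", "hate"}
    score = sum(rep.get(w, 0) for w in positive) - sum(rep.get(w, 0) for w in negative)
    return "positive" if score >= 0 else "negative"

def decode_translation(rep, lexicon):
    """Task head 2: a word-for-word 'translation' from the same representation."""
    return " ".join(lexicon.get(w, w) for w in rep)

rep = encode("I love this great podcast")
print(decode_sentiment(rep))  # -> positive
print(decode_translation(rep, {"great": "toll", "podcast": "Podcast"}))
```

The point is structural: one `encode` step feeds many task-specific `decode` steps, which is the same shape as fine-tuning one pretrained encoder for translation, sentiment, or classification.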

David Yakobovitch 

It’s also amazing to think that there have been so many different tests. You mentioned GPT-2, but we’ve also seen, from data sets like SQuAD 2.0 and all the different ones that OpenAI has worked on, that the high performance is not just in one use case anymore; it’s all across the industry that we’re seeing high performance in NLP. What do you think has led to a lot of this momentum?

Daniel Whitenack 

Part of it is that the size of the models has increased a lot, and they’re processing a lot of data, so these word embeddings, these representations of text that are learned in the model, actually encode a lot about language in general. It’s been shown in a couple of studies that you can actually backtrack out of these embeddings the traditional syntax structure of text that linguists are familiar with, like grammars and such. So a lot of information is encoded in these embeddings, which makes it much easier to adapt to all of these different tasks. The second thing is that a lot of these models are being trained on multiple tasks at the same time.

One recent example is Google’s model, which really just assumes that the input is text and the output is text, but that input text can include a little tag at the front that says something like “translate this into German,” or another tag like “give me the sentiment,” or another tag like “answer this question.” It’s actually being trained on all of those tasks at the same time, so it’s learning the best representations of language for multiple tasks, which makes it easier to fine-tune and transfer-learn that model to a wide variety of scenarios.
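The text-to-text framing above boils down to prepending a task tag to every input. A minimal sketch of that preprocessing step, with prefixes that mirror the style Daniel describes (the `TASK_PREFIXES` table and the routing function are illustrative, not from any real library):

```python
# Hypothetical task-prefix table in the style of a text-to-text model:
# every task becomes "prefix + input text -> output text".
TASK_PREFIXES = {
    "translate_de": "translate English to German: ",
    "sentiment": "give me the sentiment: ",
    "qa": "answer this question: ",
}

def to_text_to_text(task, text):
    """Turn a (task, raw text) pair into a single tagged input string."""
    if task not in TASK_PREFIXES:
        raise ValueError(f"unknown task: {task}")
    return TASK_PREFIXES[task] + text

# One batch can mix tasks, because every example is just tagged text.
batch = [
    to_text_to_text("translate_de", "The house is small."),
    to_text_to_text("sentiment", "I really enjoyed this episode."),
]
print(batch[0])  # translate English to German: The house is small.
```

Because every task shares this one input/output format, a single model can be trained on the mixed batch, which is what makes the multitask training Daniel mentions possible.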

David Yakobovitch 

So we’re talking about the phrase “transfer learning,” and this is something I heard a lot in 2019: a lot of researchers, especially from the MIT Media Lab, saw that we were having breakthroughs here, that transfer learning was possible, and we’re starting to see this, as you mentioned, with translation from languages like English to German. But how possible is transfer learning today? Are we also seeing this with English to Mandarin, or English to Arabic? What’s that looking like, or where do you think the state of transfer learning is?

Daniel Whitenack 

Transfer learning depends a lot on the parent model that you transfer from, and there are very multilingual models out there, some including up to 104, maybe 109, languages at the most. Those certainly learn a lot about the language families they’re working with, can do very interesting things, and transfer very easily to tasks in a lot of those languages. But I’m glad you brought up the topic of languages, because that’s really at the heart of my personal work.

People might not know this, but there are actually 7,117 languages currently being spoken in the world, and these aren’t dialects or anything like that.

These are actual languages being spoken in the world, and if we think about a multilingual model that supports 104 languages in its embeddings, that’s a drop in the bucket. Some tasks, like speech-to-text or text-to-speech, especially in NLP platforms, only support maybe 10 to 20 languages. So there’s a long way to go in terms of NLP for the world’s languages; NLP certainly is not a solved problem at this point, and there’s a lot to do. That being said, there are a lot of interesting things happening, and you’re focusing on the transfer learning part.

There are really a lot of interesting things that we can do with these modern transfer learning techniques for local languages. Let’s say you’re interested in creating a translation model for a vernacular Arabic which is only spoken in certain areas. Well, there’s a lot of language data available for Standard Arabic.

So you could train an English-to-Standard-Arabic translation model on a ton of data and then transfer-learn that model to the Arabic vernacular, which is a very similar language, so you’re going to have a pretty good chance. If you leverage the family tree, which is a lot of what we study at SIL, what languages are out there and how they’re related, then you can take advantage of transfer learning in creative ways.
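The warm-starting recipe Daniel outlines, train a parent model on a high-resource pair, then fine-tune a copy of its parameters on a small amount of related-language data, can be demonstrated with a deliberately tiny one-parameter "model." Everything here is illustrative; the numbers just make the warm start measurably better than training from scratch.

```python
def train(params, data, lr=0.1, steps=100):
    """Fit a single scale parameter w so that w * x approximates y
    by plain gradient descent on squared error."""
    w = params["w"]
    for _ in range(steps):
        for x, y in data:
            grad = 2 * (w * x - y) * x
            w -= lr * grad
    return {"w": w}

# Parent task: lots of data for a relationship y = 2x
# (standing in for the high-resource English-to-Standard-Arabic pair).
parent = train({"w": 0.0}, [(1, 2), (2, 4), (3, 6)])

# Child task: only ONE example of a closely related relationship y = 2.1x
# (standing in for the low-resource vernacular).
cold = train({"w": 0.0}, [(1, 2.1)], steps=3)        # from scratch
warm = train(dict(parent), [(1, 2.1)], steps=3)      # warm-started copy

# With the same tiny budget, the warm-started model lands closer to 2.1.
print(abs(cold["w"] - 2.1) > abs(warm["w"] - 2.1))  # True
```

The design point is that `dict(parent)` copies the parent's learned parameters, so the child starts near a good solution and needs far less vernacular data, which is exactly the leverage transfer learning provides across related languages.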

David Yakobovitch 

It’s interesting that you mention vernacular Arabic. I was recently working in the Middle East, prior to the whole COVID-19 shutdown, and I was in some Ubers with drivers who only spoke Arabic and didn’t speak English, and I’ll tell you, it was challenging.

I got to learn some phrases when I was out there with tour guides and people in the business district, but it is a lot to learn. Something that came on the market just recently, which I wish had been there a few weeks ago, is that Google Translate announced just this past week their live real-time transcription feature for Android.

Prior to this, I was on Google Translate in the Middle East, typing in a phrase in English, translating it to Arabic, and pressing play to hear the audio, and so forth. It’s amazing that all these breakthroughs are now happening in 2020.

Daniel Whitenack 

I’m really hoping that what we start to see in 2020 is an acceleration of this technology through the long tail of languages, because with 7,000 languages, if we tackle one language every six or twelve months, it’s going to take us a long time to support things like translation or speech-to-text in 7,000 languages. So I’m hoping we see some sort of rapid-adaptation technology come about in 2020 that will let us tackle 40, 50, a hundred languages at a time.

I’m really encouraged that we might see that, because populations out there that get a COVID-19 notice in a language they don’t understand are actually further marginalized because of that. If they had the technology to do the translation, or to interact with their device via voice, or to access educational material, or whatever it is, then there are new opportunities brought about, and they’re able to operate in the language they value. That’s part of what I hope for; with this NLP momentum, that’s where I hope it goes.

David Yakobovitch 

I have high hopes for that as well. Last year I was at an in-person conference in New Jersey called the Voice AI Summit, which was all about conversational AI. I spoke with Noel Silver, who is the head of digital at NPR for all their podcasting initiatives and previously led a lot of AI product management at Microsoft, and we shared the keynote stage. One thing we talked about is the dying of languages. In particular, we brought up Icelandic and talked about how this language has been changing with millennials and the new generation; it is facing the threat of digital extinction. We got into this conversation because a lot of apps are not being built language-first to support all languages; they might only support English, Mandarin, Arabic, and the big nine, if you will. Perhaps what we’re seeing now from Google Translate, and your efforts as well at Ethnologue, could help bridge that gap.

Daniel Whitenack 

Definitely, and part of the reason for that is that the underlying building blocks of apps, whether that be the APIs that support chat functionality, like entity analysis, sentiment analysis, speech-to-text, and those sorts of things, just aren’t there in the other languages without you building a custom one yourself.

SIL has been collecting data in around 2,000 languages since the 1930s, and I’m really excited to be part of teams that are starting to leverage those existing resources, which really haven’t been tapped into, I don’t think, because they’re archived in unusual ways; they’re not in the sorts of formats that AI people are typically used to working with. So we’re just at the tipping point where we can really jump in and utilize a lot of that data in creative ways.

David Yakobovitch 

That’s completely true and we’re talking here about languages today.

There needs to be a lot more support; again, we’re hearing it for English, Spanish, German, Chinese, and Japanese. Even TensorFlow by Google now supports multiple languages, so we’ve been seeing a lot with Mandarin, and I hope we’ll see a lot more support for other languages. Daniel, what are your thoughts on some of the languages you feel are dying or that you’d like to see more support for?

Daniel Whitenack 

I’m working with our chief research officer here at SIL; he’s really interested in helping us understand where we should put our efforts in terms of initial development, and that’s a really interesting question. There are certain languages that, as you say, maybe aren’t being used in the same way they were before. There are other languages that would be used digitally but just aren’t supported yet, and there are economic concerns and literacy concerns and all of these things wrapped up together. We have a lot of data around all of those things: economic factors in various countries, the populations of language communities, where those language communities are, how that overlaps with the economy, and how that overlaps with existing data sets. That’s really what the Ethnologue, which you mentioned, tracks.

What Gary has been doing is looking at hot spots of where we should be putting effort, and a lot of those hot spots are in areas like the Philippines, where there are a lot of languages that aren’t supported, but where people are already fairly digital and there’s a lot of economic development. There’s also Indonesia, one of the biggest emerging markets in the world; there are 700 or so languages spoken just in Indonesia, and there are big tech unicorns already existing there. So those are two that come to mind where there’s really not much support out there yet, but where, both economically and impact-wise, it could really make a pretty big impact for those language communities.

David Yakobovitch 

You’re right: in all these developing and frontier nations, we’re going to see a lot, not only with languages but with literacy rates, and it’s important that we bridge the gap on that digital divide. It’s not just the language apps we’re talking about, what you’re doing at SIL, what we’ve seen with Google Translate this year, or what you’ve mentioned with Ethnologue; it’s not always human interaction. A lot of it could be automated, a lot of it could be quality control, all this QA with new systems that we’ve seen go online since 2015, like chatbots.

It almost seems that today chatbots are everywhere, on every app and every website; it has become synonymous. I pull up a website and the first thing I see is a chatbot popping up. It’s as if they’re everywhere. So about chatbots: one, are they being used for all these different languages, and two, how good are they? Let’s start there: where are they on the languages they support?

Daniel Whitenack 

Similarly to what I mentioned before, the language support in terms of the building blocks of digital assistants and chatbots is fairly limited. Dialogflow, Watson, and the other platforms that are typically used have done a great job at building support, so not to speak poorly of them, but it’s still in the tens, maybe 50 to a hundred, languages. So for building applications in places like Indonesia or the Philippines, like we’re talking about, there’s still a lot of support that’s lacking. One of the areas we’ve been putting some effort into recently is question answering, because what we found in running chatbot studies in various places around the world is that for people who are accessing the internet fairly newly, maybe on a smartphone, one of the things they really want to do is just ask a bunch of questions and get answers.

They’re not so familiar with how to get those answers, or maybe the resources aren’t in their languages, so supporting question answering would be a big deal, and we’ve started to work on that front. There are also some newer question-answering data sets from Facebook and Google that are actually multilingual, so I’m hoping those bridge part of the gap. But for chatbots in general, I would say there’s less support than there is for a general technology like Google Translate or machine translation.

So it’s fewer languages than that, but you can again do some creative things to bridge the gap, like transfer learning and building custom components under the hood to support new languages. Whoever does crack the nut of rapidly adapting these technologies for hundreds of languages at a time, rather than one at a time every six months, is going to have some great momentum, in terms of economic success but also impact for language communities.

David Yakobovitch 

That makes complete sense, and as you’ve mentioned, Daniel, there have been a lot of data sets out here that are beginning to bridge that gap, where we have researchers on competitions like Kaggle, or from big companies like Facebook and Google, who are working at it. Stanford has SQuAD, the Stanford Question Answering Dataset; they recently released version 2.0, which has been helpful for reading comprehension, again mostly focusing on the English language, but still helping us make a lot of progress with transfer learning.

Beyond that, of course, Google has their Natural Questions dataset for question answering, and they’ve been doing a lot of work in open domains. Similarly, as you mentioned, Facebook has a few initiatives; one that I’ve enjoyed looking at before is, I’m not sure how it’s pronounced, the bAbI project, all about baby-level tasks and learning. The goal is basically to get AI to the level of a human child; if you can get it to that level, that is a big breakthrough in question answering. What are some other data sets that you’ve seen as well?

Daniel Whitenack 

I’ve played around quite a bit with, I believe it’s called, MLQA from Facebook. Actually, if you just go to facebookresearch/MLQA on GitHub, it’s there. It’s one of the first multilingual question-answering data sets that I’ve seen, or I might be wrong about that, but I’m hopeful about it. If you imagine going into a new language community with a virtual assistant, imagine if that virtual assistant had the ability to query, in natural-language form, Wikipedia articles or something like that. That’s the sort of thing this could enable. There are still other pieces of the puzzle, like document search and that sort of thing, but this is a big step in the right direction.
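For listeners who want to poke at data sets like this: MLQA distributes its data in SQuAD-style JSON. The sketch below builds a tiny in-memory example of that structure and flattens it into question/answer pairs; the sample content is made up, but the nesting (data → paragraphs → qas → answers) mirrors the real format.

```python
import json

# A minimal, made-up sample in SQuAD-style JSON, the format MLQA uses.
sample = json.loads("""
{
  "data": [{
    "paragraphs": [{
      "context": "Jakarta is the capital of Indonesia.",
      "qas": [{
        "id": "q1",
        "question": "What is the capital of Indonesia?",
        "answers": [{"text": "Jakarta", "answer_start": 0}]
      }]
    }]
  }]
}
""")

def flatten_qa(dataset):
    """Yield (question, gold answer, context) triples from SQuAD-style JSON."""
    for article in dataset["data"]:
        for para in article["paragraphs"]:
            for qa in para["qas"]:
                gold = qa["answers"][0]["text"] if qa["answers"] else ""
                yield qa["question"], gold, para["context"]

pairs = list(flatten_qa(sample))
print(pairs[0][1])  # Jakarta
```

The same `flatten_qa` loop works on the real MLQA files once you `json.load` them from disk, since they share this schema.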

David Yakobovitch 

It’s amazing to see that these data sets are out here and are made open source to help solve problems, whether we’re looking at any of these data sets or even at more modern times, modern times including COVID-19. There has been a huge initiative in the last few days where the White House in America said it is urging AI experts to develop tools for the COVID-19 dataset, so it has been a call to action.

Actually, Kaggle has released the COVID-19 data sets, so if you are someone who is interested in natural language processing, this was just recently announced on Kaggle. The data sets include thousands and thousands of documents and research papers on everything: COVID-19, SARS, Ebola, coronaviruses, to better understand how to extract knowledge and valuable information from these text documents. Isn’t it amazing, seeing all of humanity work together?

Daniel Whitenack 

It’s pretty cool, and I know there are also efforts with various websites where you can donate computing time to do things like protein-folding operations to help, so that’s another way. I don’t know if it’s the same exact data.

Allen AI also has a page where they talk about the various data sets related to COVID-19. One of the things I’ve really appreciated about Allen AI on that front is that they also operate the Semantic Scholar project, which we’ve talked about on the Practical AI podcast as well. It lets you really quickly find related research, search through research, and tag research, and it bubbles up to the surface the things you should essentially be looking at if you’re following a train of thought in terms of new techniques or new research. So combining that capability with this sort of data could prove very interesting, as it bubbles up the right things at the same time as providing access to the right data.

David Yakobovitch 

It’s amazing to see how many open source platforms are out there today, including preprint publishing platforms, Daniel, whether we look at arXiv or even bioRxiv. There are all these incredible platforms: you’re talking about Semantic Scholar, we’re talking about Kaggle, and the Allen Institute from the late co-founder of Microsoft, Paul Allen.

One of the saving graces of COVID-19 is that the world is really coming together in ways we’ve never seen before. Looking back at 2019 and early 2020, it seemed that we were moving towards isolationism from a research perspective, but it’s as if this has now triggered humanity back into a single mission and motive: to collaborate and work together.

Daniel Whitenack 

It’s definitely an interesting time, and I know people like to focus on how there’s a lot of disruption, and that’s definitely true; there are a lot of people experiencing real suffering out there. But at the same time, there are also some new opportunities arising. I was part of a group called Connie working on machine translation for African languages, and we submitted a paper to ICLR, which is going to be part of one of the workshops there. I wasn’t going to be able to attend for various reasons unrelated to COVID-19, but now they’ve made the conference completely virtual, so because of that, I feel like I’m actually going to be part of things. ICLR is a huge machine learning research conference, and this is another great opportunity to follow the state of the art and join in virtually, whereas before you might not have been able to travel there, and they might not have put in the effort to make the virtual experience really good. But now it’s all virtual.

So that’s what they’re putting their effort into, and it’s a real opportunity for people to learn a lot, to participate in discussions, to watch the luminaries in the field, and all of those things.

David Yakobovitch 

Looking at these conferences: last year NeurIPS, in particular, said they were limiting the number of seats and doing a lottery system, and there were researchers and academics in STEM who could not get to the conference. So it’s amazing to see that now, for something like ICLR, where there could be who knows how many thousands of people going, we could even have hundreds of thousands of people tuning in and dialing in online.

Daniel Whitenack 

There are people exploring a lot of new ways to keep the momentum going in the AI community as well. I know you and I were talking about a few of those before this conversation: the Kaggle-related things, for example, and I’ve been exploring some things personally.

I do a lot of workshops at conferences, and I had three or four planned this year, and that’s all thrown up in the air. So I decided to put together a sort of virtual AI workshop, like what would be at a conference and even more, that I’m going to do in May, called AI Classroom. It’s going to cover all the fundamentals of AI, hands-on examples with both PyTorch and TensorFlow, and a bunch of other things, including examples of computer vision and natural language processing. My hope, along with my efforts here, is that we’ll be seeing a lot more of these sorts of things pop up, which we probably should have been seeing before this.

COVID-19 has forced us to consider what the best way in our modern world is to do something like an AI workshop. The day before a conference might not be the best way to do it, because it limits the people who can be there, and it limits some of the interesting things we can do with technology in terms of sharing content and files, especially on conference wifi. So I’m experimenting with some of these other virtual solutions. I’m really excited about that, and really excited to get more involved in that side of things and try something new.

David Yakobovitch 

So, if we’re looking at virtual conferences, if we’re looking at virtual training, what are some of the ways we can keep up virtually besides attending them? Are there certain tools that you’re finding very helpful today?

Daniel Whitenack 

Well, one of the things that I use a ton, and which will for sure feature in that training, is Google Colab, because you can share that work so easily. For those who aren’t familiar with it, Google Colab is like a hosted Jupyter notebook, but it’s in Google Cloud, and it operates a lot like Google Docs in the sense that you spin up a new Colab notebook and then you have that Jupyter interface where you can do things.

It also has a bunch of stuff preloaded; you can share it with team members very easily and even comment on each other’s work, and you’re able to run experiments very quickly on a GPU or a TPU that’s automatically connected to that Colab instance. And now they’ve come out with Colab Pro, which you and I were mentioning right before jumping on this podcast.

I had four different things running in Colab notebooks on GPUs, for super cheap, way cheaper than it would be to order a GPU and have it sitting in my office, so I’m really excited about that. It makes collaborating on these types of experiments very easy. It’s not the best way to productionize things, definitely, because it’s still a Jupyter notebook; you don’t really productionize AI work through a Jupyter notebook. But it does make collaborating on ideas, sharing things, and training or sharing tutorials very easy.
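A small, framework-agnostic check you might run at the top of a notebook to see whether a GPU runtime is attached: Colab's NVIDIA GPU runtimes ship with the `nvidia-smi` driver utility, so on a CPU-only runtime (or a laptop without NVIDIA drivers) it simply is not on the PATH. This is a heuristic sketch, not an official Colab API.

```python
import shutil

def gpu_runtime_attached() -> bool:
    """Return True if the nvidia-smi binary is available, i.e. an NVIDIA
    GPU driver is installed in this environment."""
    return shutil.which("nvidia-smi") is not None

print("GPU runtime:", gpu_runtime_attached())
```

If this prints False inside Colab, the runtime type can be switched to GPU or TPU from the notebook's runtime settings before rerunning experiments.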

David Yakobovitch 

Daniel, don’t you want to have your own T4 or have your own K80 or your own P100 tower sitting at home?

Daniel Whitenack 

I would love to have that; unfortunately, I don’t have the resources. I’d have to sell quite a few things to fund that, for sure. With Colab, I always feel like I get superpowers when I use it, because working for a nonprofit, it’s not like we just have data centers full of GPUs that I can access. Sometimes we have to get pretty creative with funding, so this has been a big one for me.

David Yakobovitch

I love it too, and I’ve even recommended it to students, especially when we encounter infrastructure challenges offline: spin up a Google Colab instance. You can even accelerate it with a GPU or TPU, which is more than you can do on Kaggle, even though they’re both Google companies at this point. Google Cloud is excellent and I love the software, so I strongly encourage that, beyond what we’re seeing with Google Colab Pro and Kaggle.

We are now fully living in this remote society, which is showing how humans can work well with machines. Of course, it will not always be like this; there are going to be ebbs and flows, times when we’re more in person and times when we’re more remote. What this really goes to show is how, during a pandemic, we can keep up with each other. For myself, in New York City, I have already hosted a couple of digital dinners, basically a Zoom dinner series where we talk about tech and business and life, and you can have more than 10 people. You can do breakout rooms, and there are even all these cool new apps that let you do things like professional speed dating online.

There’s one I was just exploring recently that we’re going to be using for one of our dinners, called Hopin, which literally lets you do speed networking at your event. It’s nice if you have a remote dinner that’s more than 10 people, maybe 50 or a hundred, and you want to get to know each other; you can definitely connect with Hopin’s chatbot and see if it’s available in your language as well. Beyond that, we are in this pandemic, but keeping up virtually is very important. One thing I’ve noticed about both of us, Daniel, being hosts and co-hosts of podcasts, is that I’m picking up the phone more and I’m on video more. I feel like human interaction is back again, even more than prior to the pandemic.

Daniel Whitenack 

It is interesting how dynamics shift in that way, and this is definitely something we’ve seen with chatbots across the world as well: there are a lot of people who maybe wouldn’t have a long conversation or multiple interactions in a social setting, but they’ll spend so much time with a chatbot, asking one question after another just to see what happens, to some degree, because they get a nugget of truth every once in a while. So there is a different dynamic when you’re interacting, whether that be with a chatbot or with a person online; there’s a different feeling and a different dynamic at play, for sure.

David Yakobovitch

A lot of it, again, is these digital twins, having these identities through chatbots and these conversations. Recently Amy Webb came out with her Future Today Institute report, and one of the biggest themes was digital twins in 2020: we’re all going to be deepfaked and cheapfaked, and all of these things around text, audio, and video. There’s a lot of promise, and a lot of conversations being had in this space around NLP, around transfer learning, around text. In particular, as I’ve mentioned, you’re also a podcast host of Practical AI. Can you share with our listeners today some of the conversations that you and your co-host Chris have with your listeners?

Daniel Whitenack 

It has been a great experience; we’re on episode 70-something, maybe 80 now, and I’ve really enjoyed it. The show is really focused on, as you might have guessed, the practicalities of being an AI developer these days, not only for those who are currently AI developers, but for those who would like to be. So we dig into a bunch of the different technology, talking about things like: what is BERT? What does it mean? How is it used? But we also talk to practitioners themselves. Just the other day, we talked to a team from Esri who is using satellite imagery to help predict blockages along roads that the U.S. military is using for disaster relief efforts, to help route resources to people who need them. There are all sorts of amazing stories like that, and we’ve had people on from all the major AI players, like Hugging Face and Google and OpenAI and Microsoft and Amazon. It’s been great; it’s an excuse for me to have a lot of amazing conversations and learn a lot about AI myself.

David Yakobovitch

It’s amazing to see how the industry continues to grow, and there are so many companies there. Hugging Face has done a lot of great research, and a lot of the companies you mentioned are doing great research. From a trend perspective, you’ve spoken with a lot of the foremost thinkers and researchers in the space. What are some of the trends or directions in industry that you’re seeing for NLP throughout 2020, and even into 2021?

Daniel Whitenack 

It’ll be interesting this year to follow both reinforcement learning and generative adversarial networks (GANs). Both of those technologies get a lot of hype because of some of the things that they power, like deepfakes and other things that you talked about. But we haven’t really entered into a season where reinforcement learning and GANs are powering a lot of enterprise applications the way that deep learning models have actually penetrated enterprise applications, so it’ll be interesting to see if those technologies can penetrate.

I see some movement in that direction now. It’s easier to make those technologies practical these days. I was at a talk at the Datapoints Summit in Chicago a couple of weeks ago, and one of the people there was using reinforcement learning in demand forecasting for marketing applications, so I know that it’s out there and people are using it.

The other thing we’ll see a lot of is multimodal applications, where we’re no longer just talking about “this is a text application, like machine translation,” or “this is a speech application,” or “this is a computer vision application.” We’re going to see more and more models, applications, and usages of AI out there that take both imagery and text, for example, to make some prediction. OpenAI’s recent robot hand manipulating a Rubik’s Cube is a good example of that, where they’re actually taking imagery data plus sensor data and using reinforcement learning, all piled together. So this sort of multimodal approach is something we’re going to see more of.

David Yakobovitch

Excellent. As we move into these different modes of learning, whether it’s us in classrooms that are remote, or learning with data through NLP and computer vision systems, one thing we know for certain is that the world is not stopping, and we are continuing to move forward with research. It’s been such a pleasure to learn from your insights, your podcast, and the work you’re doing at your organization, SIL, here on HumAIn. Can you share with us any call to action, or what’s next in the work you’re doing today, Daniel?

Daniel Whitenack 

Today I’m working on a pretty interesting project related to COVID-19, which is translating some health-related information into many languages, hopefully hundreds, rather than just the hundred or so that Google Translate supports. I’m going to have a blog post about that soon, so keep an eye out for it. Also, if people are interested in using some of their at-home time to level up their AI skills, I would recommend checking out the AI Classroom virtual training at datadan.io, and of course, because all of your friends on this podcast are my friends on Practical AI, you can use the coupon code PRACTICALAI for a discount. Then check us out at the Practical AI Podcast, and you can learn more about the work that SIL is doing at SIL.org.

David Yakobovitch

Daniel Whitenack from SIL and co-host of the Practical AI Podcast. Thanks for joining us today on HumAIn.

Daniel Whitenack 

It was great to be here.

David Yakobovitch   

Thank you for listening to this episode of the HumAIn Podcast. What do you think? Did the show measure up to your thoughts on artificial intelligence, data science, the future of work, and developer education? Listeners, I want to hear from you so that I can offer you the most relevant, trend-setting, and educational content on the market.

You can reach me directly by email at david@yakobovitch.com. Remember to share this episode with a friend, subscribe and leave a review on your preferred podcasting app and tune into more episodes of HumAIn.  

Works Cited

¹Daniel Whitenack

Companies Cited                    

²SIL International