Program > Small-group discussion

Thursday 5 July : 14:30 - 16:00



ROOM DEVILLE, LIBRARY, 1st floor, 29 rue d'Ulm

- Language learning in grounded environments: 
In this session, we will investigate how infants can benefit from grounding for language acquisition. This calls for a characterization of the linguistic and environmental input that is sufficient for infants to learn language in naturalistic settings across cultures. The second part of this break-out session will be dedicated to reverse-engineering the language acquisition capacity in order to construct language learning systems with weak supervision, noisy multimodal input, and few-shot learning. We will discuss virtual environments and datasets for training grounded language learning agents.

 ROOM HAUY, LIBRARY, 1st floor, 29 rue d'Ulm

- Modeling language acquisition using Recurrent Neural Networks: 
Recurrent neural networks (RNNs) have achieved impressive results in a variety of linguistic processing tasks, suggesting that they can induce non-trivial properties of language. In this break-out session, we will discuss how these models can induce natural language rules simply by being exposed to large amounts of textual data, and we will also investigate failure cases.


ROOM PARRIN, LIBRARY, 1st floor, 29 rue d'Ulm

- Statistical learning for language acquisition: 
Infants learn their first language using multiple linguistic and non-linguistic cues. A process called statistical learning, which extracts the structure of these multimodal cues by finding patterns, has been hypothesized to be used by infants. For instance, in speech perception and processing, infants are thought to compute transitional probabilities between adjacent sound units during their first year of life in order to detect word boundaries. To what extent are statistical learning mechanisms key to language acquisition? How do they evolve during development? Insights from psycholinguistic experiments and modeling will be discussed.
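The transitional-probability mechanism above can be sketched in a few lines of Python. The toy lexicon and the 0.5 boundary threshold below are illustrative assumptions, not stimuli from any particular experiment: within each trisyllabic word the transitional probability between adjacent syllables is 1.0, while across word boundaries it drops to roughly 0.25, so positing a boundary at every dip recovers the words.

```python
import random
from collections import Counter

# Hypothetical artificial lexicon of trisyllabic words (illustrative only)
words = ["tupiro", "golabu", "bidaku", "padoti"]

def syllabify(word):
    """Split a word into two-letter syllables."""
    return [word[i:i + 2] for i in range(0, len(word), 2)]

# Build a continuous syllable stream by concatenating random words
random.seed(0)
stream = []
for _ in range(300):
    stream.extend(syllabify(random.choice(words)))

# Transitional probability TP(a -> b) = count(a followed by b) / count(a)
pair_counts = Counter(zip(stream, stream[1:]))
first_counts = Counter(stream[:-1])
tp = {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

# Posit a word boundary wherever TP dips below a threshold (assumed 0.5)
threshold = 0.5
segments, current = [], [stream[0]]
for a, b in zip(stream, stream[1:]):
    if tp[(a, b)] < threshold:
        segments.append("".join(current))
        current = []
    current.append(b)
segments.append("".join(current))

print(sorted(set(segments)))
```

With this deterministic toy lexicon the dips in transitional probability coincide exactly with the word boundaries, so the segmenter recovers the four original words without any prior lexicon.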

LSCP seminar room, Pavillon Jardin, 29 rue d'Ulm
 
- Bootstrapping language: 
In the field of psycholinguistics, a theoretical framework has developed the idea of synergies (or bootstrapping) in language acquisition: impoverished knowledge in one area of language (e.g., semantics) might help infants and toddlers adjust their representations in another area (e.g., syntax). For instance, the syntactic bootstrapping hypothesis claims that infants learn the meaning of words by paying attention to the syntactic structures in which these words occur. In a parallel way, other researchers have proposed a semantic bootstrapping hypothesis, and further multi-level bootstrapping mechanisms could also be posited. In this session, we will discuss the psycholinguistic evidence supporting this general multi-level bootstrapping framework, as well as extant counter-evidence. We will also discuss the computational models implementing such hypotheses that have been proposed in the literature, and the insights that can be drawn from them.
 

 

 


