Full metadata record
DC field: Value [Language]
dc.rights.license: Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND) [es]
dc.contributor.author: Marichal, Sebastián [es]
dc.contributor.author: Rosales, Andrea [es]
dc.contributor.author: González Perilli, Fernando [es]
dc.contributor.author: Pires, Ana Cristina [es]
dc.contributor.author: Blat, Josep [es]
dc.date.accessioned: 2023-05-30T12:13:56Z [-]
dc.date.available: 2023-05-31T03:05:10Z [-]
dc.date.issued: 2022-05-01 [-]
dc.identifier.citation: Marichal, S., Rosales, A., González Perilli, F., Pires, A., & Blat, J. (2022). Auditory and haptic feedback to train basic mathematical skills of children with visual impairments. Behaviour & Information Technology, 1–51. https://doi.org/10.1080/0144929x.2022.2060860 [es]
dc.identifier.uri: https://hdl.handle.net/20.500.12381/3235 [-]
dc.description.abstract: Physical manipulatives, such as rods or tiles, are widely used for mathematics learning, as they support embodied cognition, enable the execution of epistemic actions, and foster conceptual metaphors. Counting them, children explore, rearrange, and reinterpret the environment through the haptic channel. Vision generally complements physical actions, which makes using traditional manipulatives limited for children with visual impairments (VIs). Digitally augmenting manipulatives with feedback through alternative modalities might improve them. We specifically discuss conveying number representations to children with VIs using haptic and auditory channels within an environment encouraging exploration and supporting active touch counting strategies while promoting reflection. This paper presents LETSMath, a tangible system for training basic mathematical skills of children with VIs, developed through Design-Based Research with three iterations in which we involved 19 children with VIs and their educators. We discuss how the system may support training skills in the composition of numbers and the impact that the different system features have on slowing down the interaction pace to trigger reflection, in understanding, and in incorporation. [es]
dc.description.sponsorship: Universitat Pompeu Fabra (Spain) through MIREGAMIS: 2018 LLAV 00009 [es]
dc.description.sponsorship: Agencia Nacional de Investigación e Innovación - ANII [es]
dc.description.sponsorship: Fundación Ceibal [es]
dc.description.sponsorship: Centro Interdisciplinario en Cognición para la Enseñanza y el Aprendizaje - CICEA, Universidad de la República [es]
dc.description.sponsorship: Universitat Oberta de Catalunya (Spain) through Ministry of Science, Innovation, and Universities IJCI-2017-32162 [es]
dc.description.sponsorship: LASIGE Research Unit (Portugal) through FCT project mIDR (AAC02/SAICT/-2017, project 30347, cofunded by COMPETE/FEDER/FNR), the LASIGE Research Unit, ref. UIDB/00408/2020 and ref. UIDP/00408/2020. [es]
dc.format.extent: 51 pages [es]
dc.language.iso: eng [es]
dc.publisher: Taylor & Francis [es]
dc.rights: Open access [es]
dc.source: Behaviour & Information Technology [es]
dc.subject: Tangible user interface [es]
dc.subject: Visually impaired [es]
dc.subject: Cognitive training [es]
dc.subject: Technology-enhanced learning [es]
dc.subject: Embodied interaction [es]
dc.title: Auditory and haptic feedback to train basic mathematical skills of children with visual impairments [es]
dc.type: Article [es]
dc.subject.anii: Social Sciences [-]
dc.subject.anii: Education Sciences [-]
dc.identifier.anii: FSED_2_2020_1_163592 [es]
dc.type.version: Accepted [es]
dc.identifier.doi: https://doi.org/10.1080/0144929X.2022.2060860 [-]
dc.rights.embargoterm: 2023-05-22 [es]
dc.ceibal.researchline: Innovation in teaching and learning [es]
dc.ceibal.researchtema: Emerging modes of communication mediated by digital technology and their integration into learning [es]
dc.subject.ceibal: Tangible interaction [es]
dc.subject.ceibal: ICT [es]
dc.subject.ceibal: Technology-mediated teaching [es]
dc.subject.ceibal: Video games [es]
Appears in collections: Fundación Ceibal

Files in this item:
File: Last Revision - Main Document with Authors Details - clean copy.pdf (3 MB, Adobe PDF)

Works in REDI are protected by Creative Commons licenses.
For more information about the terms of this publication, visit: Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND)