Category Archives: Uncategorized

CoFFEE Start Kit




Research Area Overview

Computer Supported Collaborative Work and Learning are research areas that study and develop systems supporting collaboration in the workplace and in education, respectively. In general, collaboration can be remote or co-located, synchronous or asynchronous, and the various possible scenarios present specific challenges.

CoFFEE (Collaborative Face-to-Face Educational Environment) was developed within the European project LEAD as a system supporting collaborative problem solving in the classroom. CoFFEE was built as a set of applications based on the Rich Client Platform (RCP), the core of Eclipse; it therefore inherits Eclipse's plug-in architecture, and all of CoFFEE's collaborative tools are developed as plug-ins. This guarantees the possibility of integrating new tools into the system. Other directions for CoFFEE's development concern the integration of CoFFEE's services into Eclipse itself, to support collaboration within the development environment, and the use of the system in a business setting rather than a learning one.
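To make the plug-in idea concrete, here is a minimal sketch of how a collaborative tool could be contributed as an RCP view. The class and its contents are hypothetical, not taken from the CoFFEE sources, and a real view must also be declared in the plug-in's plugin.xml through the org.eclipse.ui.views extension point.

    import org.eclipse.swt.SWT;
    import org.eclipse.swt.widgets.Composite;
    import org.eclipse.swt.widgets.Text;
    import org.eclipse.ui.part.ViewPart;

    // Hypothetical CoFFEE-style tool: each tool is an RCP view contributed
    // by a plug-in, so new tools can be added without touching the core.
    public class ChatToolView extends ViewPart {

        private Text transcript;

        @Override
        public void createPartControl(Composite parent) {
            // The tool's UI is built with SWT, like any other RCP view.
            transcript = new Text(parent, SWT.MULTI | SWT.READ_ONLY | SWT.WRAP);
            transcript.setText("Shared discussion transcript goes here.");
        }

        @Override
        public void setFocus() {
            transcript.setFocus();
        }
    }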

Technical background

Eclipse

Rich Client Platform and Applications

ECF

Other useful links

Eclipse and RCP:

OSGi

Example of a Rich Client Application

SWT

Background on CoFFEE

CoFFEE, the user documentation, and the technical documentation can be downloaded from SourceForge:

Towards a mobile CoFFEE

Resources for a preliminary study on implementing a mobile version of CoFFEE

Android

Book Chapter

Rosario De Chiara, Ilaria Manno and Vittorio Scarano

In Pinkwart N., McLaren B. M. (Eds.), Educational Technologies for Teaching Argumentation Skills, pp. 125-168. doi: 10.2174/978160805015411201010125e. ISBN: 978-1-60805-015-4, 2012.

Articles

2014

  • "How Quiz-based Tools can improve students’ engagement and participation in the classroom"

Delfina Malandrino, Ilaria Manno, Giuseppina Palmieri, Vittorio Scarano, Giovanni Filatrella.

The 2014 International Conference on Collaboration Technologies and Systems (CTS 2014). Minneapolis, Minnesota, USA, May 19-23, 2014. To appear.

2012

Delfina Malandrino, Ilaria Manno, Giuseppina Palmieri, Vittorio Scarano.

2012 IEEE 12th International Conference on Advanced Learning Technologies. Rome, Italy, July 04-July 06

2010

Furio Belgiorno, Ilaria Manno, Giuseppina Palmieri, and Vittorio Scarano.

In Proceedings of Cooperative Design, Visualization, and Engineering – 7th International Conference, CDVE 2010, Calvia, Mallorca, Spain, September 19-22, 2010.

Furio Belgiorno, Delfina Malandrino, Giuseppina Palmieri, Donato Pirozzi, Vittorio Scarano.

Collaborative Computing: Networking, Applications and Worksharing (CollaborateCom 2010), Chicago, USA, October 9-12, 2010.

Furio Belgiorno, Ilaria Manno, Giuseppina Palmieri, and Vittorio Scarano.

In Proc. of 5th European Conference on Technology Enhanced Learning, EC-TEL 2010, Barcelona (Spain), 28 September-1 October 2010. Lecture Notes in Computer Science (Springer-Verlag).

  • "Collaborative Mind Maps"

Furio Belgiorno, Delfina Malandrino, Ilaria Manno, Giuseppina Palmieri, Donato Pirozzi and Vittorio Scarano.

In Proc. of the 5th Italian Eclipse workshop (Eclipse-IT 2010). September 30th and October 1st, 2010, Savona, Italy.

  • "Adding collaboration into Rational Team Concert"

Furio Belgiorno, Ilaria Manno, Giuseppina Palmieri and Vittorio Scarano.

In Proc. of the 5th Italian Eclipse workshop (Eclipse-IT 2010). September 30th and October 1st, 2010, Savona, Italy.

  • "Collaborative Geogebra"

Emidio Bianco, Ilaria Manno and Donato Pirozzi.

In Proc. of the 5th Italian Eclipse workshop (Eclipse-IT 2010). September 30th and October 1st, 2010, Savona, Italy.

2009

"Computer-Supported WebQuests"

Furio Belgiorno, Delfina Malandrino, Ilaria Manno, Giuseppina Palmieri and Vittorio Scarano. In Proc. of 4th European

Conference on Technology Enhanced Learning, EC-TEL 2009 Nice, France, September 29&emdash;October 2, 2009. Lecture Notes in Computer Science (Springer-Verlag), volume 5794/2009, pp.712-718.

2008

Furio Belgiorno, Rosario De Chiara, Ilaria Manno and Vittorio Scarano.

In Proceedings of EC-TEL 2008, pages 401-412.

Ilaria Manno.

In Proc. of 3rd Italian Workshop on Eclipse Technologies (Eclipse IT), November 17-18, 2008, Bari, Italy. Winner of the Best Student Demo award.

Furio Belgiorno, Rosario De Chiara, Ilaria Manno, Maarten Overdijk, Vittorio Scarano, Wouter van Diggelen.

In Proceedings of EC-TEL 2008, pages 49-57.

2007

M. Beatrice Ligorio, Luca Tateo, Rosario De Chiara, Antonio Di Matteo, Ilaria Manno, Vittorio Scarano.

Summer School "Building Knowledge for deep Understanding", at the Institute for Knowledge Innovation and Technology, Toronto, Canada, August 7-10, 2007.

Rosario De Chiara, Antonio Di Matteo, Ilaria Manno, Vittorio Scarano.

In Proceedings of the 3rd International Conference on Collaborative Computing: Networking, Applications and Worksharing (CollaborateCom 2007), New York, USA, November 12-15, 2007.

  • "Collaborative Face2Face Educational Environment (CoFFEE)"

Ilaria Manno, Furio Belgiorno, Rosario De Chiara, Antonio Di Matteo, Ugo Erra, Delfina Malandrino, Giuseppina Palmieri, Donato Pirozzi, Vittorio Scarano.

First International Conference on Eclipse Technologies (Eclipse-IT 2007), 2007.

Rosario De Chiara, Ilaria Manno, Vittorio Scarano.

In Proc. of 12th Conference of the European Association for Research on Learning and Instruction (EARLI 2007), 2007.

Theses

Useful links

Designing Collaborative Learning Systems: Current Trends & Future Research Agenda. Angelique Dimitracopoulou. In Proc. of CSCL '05, the 2005 conference on Computer Support for Collaborative Learning: the next 10 years!, pp. 115-124.

Design of Extensible Component-Based Groupware. Jakob Hummes and Bernard Merialdo. Computer Supported Cooperative Work (CSCW) 9(1):53-74 (2000).






Environment State Virtual Sensor

  • The Environment State Virtual Sensor for Android (developed by Angelo Santarella) is a library that recognizes the environmental scenario in which an Android device finds itself. The basic scenario types are: motion states, noise states, proximity state, and ringer-mode state.

Project details

The Environment State Virtual Sensor for Android is a library that allows querying the environmental scenario in which an Android device finds itself. The baseline scenarios are detected through various sensors, including the accelerometer, the microphone, and the proximity sensor. The basic environmental scenarios are motion scenarios, noise scenarios, proximity scenarios, and ringer-mode scenarios. Within each basic scenario, the library detects the internal states that characterize it: the motion scenario is divided into four states (motionless, car, walk, and run); the noise scenario distinguishes a silent environment from a noisy one; and the proximity and ringer-mode states report, respectively, whether the device is boxed (in a pocket or case) or unboxed, and whether it is in silent mode or normal mode.

The thesis addressed the design, implementation, and testing of the library, in addition to developing some sample applications based on it. In the design phase, the data coming from the various sensors were analyzed; based on this analysis, the processing for each sensor was chosen and the corresponding thresholds were fitted and embedded in the library. The analysis of the accelerometer data was of particular interest: by computing the variance of the acceleration over the collected data, it was possible to distinguish the motionless, car, walk, and run states. The analysis and the fitted thresholds are implemented in the library, which returns the detected environment states in an array of four positions, one for each baseline scenario.

The implementation followed criteria of simplicity and extensibility, so that the library is easy to understand and can be extended with new sensors. After the implementation was completed, testing was carried out on a sample of users to cope with the heterogeneity of the Android device world. Finally, the library has been published as open source on SourceForge.net as the Environment State Virtual Sensor for Android. Sample applications have been built on top of the library; the most notable among them is the TelephonyManager application, which, depending on the environment states returned by the library, changes the phone settings of the smartphone to suit the detected environmental scenario.
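As a rough illustration of the variance-based classification described above, here is a minimal sketch. The state names mirror the ones in the text, but the windowing and the threshold values are hypothetical placeholders, not the thresholds fitted in the library.

    // Sketch only: classifies a window of accelerometer magnitudes by the
    // variance of the acceleration, as the project description explains.
    public final class MotionClassifier {

        public enum MotionState { MOTIONLESS, CAR, WALK, RUN }

        public static MotionState classify(float[] magnitudes) {
            float mean = 0f;
            for (float m : magnitudes) mean += m;
            mean /= magnitudes.length;

            float variance = 0f;
            for (float m : magnitudes) variance += (m - mean) * (m - mean);
            variance /= magnitudes.length;

            // Hypothetical thresholds; the real ones were fitted per sensor.
            if (variance < 0.05f) return MotionState.MOTIONLESS;
            if (variance < 0.5f)  return MotionState.CAR;
            if (variance < 3.0f)  return MotionState.WALK;
            return MotionState.RUN;
        }
    }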

Beta test

The testing of this library verifies the validity of the analysis performed on the various sensors across different Android devices; the aim is to determine whether the variance of the acceleration is the same for all devices or whether a calibration step is needed. The sample application simply shows the states returned by the library as a string, where the first part of the string is the motion state, the second the proximity state, the third the noise state, and the fourth the ringer-mode state.
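For reference, a small sketch of how the four-part status string could be consumed; the space-separated format and the example values are assumptions based on the description above.

    public class ReportParser {
        public static void main(String[] args) {
            // Example output; real values come from the LibraryTest app.
            String report = "MOTIONLESS UNBOXED NOISELESS MODE_NORMAL";
            String[] s = report.split("\\s+");
            System.out.println("motion=" + s[0] + " proximity=" + s[1]
                    + " noise=" + s[2] + " ringer=" + s[3]);
        }
    }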

The steps to perform the tests are:

1) Download the application from:

2) Transfer the downloaded application to the device, and install it.

3) Run the application LibraryTest

4) Place the phone still, resting on a flat surface, and then press the Start button. After about 2 seconds you should see a string divided into four words. Then stay motionless for about 5-10 seconds. Result ==> the first part of the string displayed should indicate the "motionless" state.

5) Try to cover the proximity sensor with a finger (it is usually next to the device's speaker). Result ==> while the finger stays on the sensor, the second word of the displayed string should change, within at most 5 seconds, from ERROR to BOXED; when the finger is released, the string should change from BOXED to UNBOXED.

6) Now try the walk/run states by shaking the device for about 3 or 4 seconds. Result ==> the first part of the string should show one of the following values: WALK, RUN, UN_CAR_WALK or UN_WALK_RUN; an error value may be shown in case of a very strong shake. The second part of this step simulates the car motion state: make slight movements of the device by hand (trying to simulate the curves taken in a car). Result ==> the states CAR, UN_ML_CAR or UN_CAR_WALK should be shown.

7) The noisy-environment test is represented by the third variable of the string. It may show the following states: NOISY (noisy), NOISELESS (very silent environment) or UN_NOISY (average noise). Check whether this state matches the environment you are in.

8) Last thing to do… the ringer-mode test, represented by the last part of the string. It can show the states MODE_NORMAL (normal mode with ring) or MODE_SILENCED (silent or vibrate-only mode). Check against the mode the telephone is set to.

9) Press the Stop button

10) Finally, the report. Whether the previous 9 steps succeeded (i.e. the values shown are those reported in the steps) or failed, send me an email at duky2003@gmail.com with: the manufacturer and model of the device; the Android version or ROM; and an indication of which steps succeeded (simply typing 1) OK, 2) OK, etc.) or of the problem in the failing step (for example: 4) the device indicates "RUN" instead of "motionless").

Thanks a lot to all the testers who participated in the testing of this library.

Testing Results

Tested devices:

  • LG Optimus One (P500), Android 2.2: everything works
  • HTC Desire Z, Android 2.3.3: everything works
  • Nexus One, Gingerbread 2.3.4: everything works
  • Huawei Ideos, Android 2.2: everything works
  • Samsung Galaxy S, Gingerbread 2.3.4: everything works
  • Acer Liquid, Android 2.2: accelerometer, noise and ringer-mode work; the proximity state needs a fix

Source Code

This project is available on SourceForge at this address: http://sourceforge.net/projects/environmentstat/ The code is open and can be checked out via SVN. The LibraryTest application is available for download at this address: http://sourceforge.net/projects/environmentstat/files/LibraryTest.apk/download You can also download the sources package here: http://sourceforge.net/projects/environmentstat/files/LibState.zip/download


BenderCatch





BenderCatch is an Android app developed by Raffaele De Falco that detects whether the user moves the device after a period of inactivity. If events (like missed calls, text messages or emails) happened during that inactivity time, the app can launch various notifications that don't require the user to look at the phone. The app can also detect whether the device is shaken, and in that case give the user more precise feedback.

App target

BenderCatch helps users who typically leave their phone or tablet somewhere, come back later, and pick up the device on their way out. Once the device is moved, if any event happened during the inactivity period, sensory feedback is triggered to make the user aware of missed calls, text messages or e-mails.

Project details

The project is now split into two main parts.

Bender Library

The library itself is composed of two Controllers, one for motion sensing and one for shake detection. Applications using the library must implement the MotionListener interface, register through the Controller's setListener(…) method, and then start sending SensorEvents through the record(…) method. If motion or a shake is detected, the Controller triggers the motionHappened(…) method, with an integer describing the type of gesture.

Bender Catch

The app uses the library to provide the functions described above. The BenderLogic class uses a synchronized method that passes the SensorEvents to the appropriate Controller and launches notifications if motion or a shake is detected, based on the user preferences and on the events that happened during the inactivity time. A sketch of this wiring is shown below.
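This is a minimal sketch under stated assumptions: the names MotionListener, setListener(…), record(…) and motionHappened(…) come from the description above, while the exact signatures, the Controller shape and the gesture constants are guesses, not the library's actual API.

    import android.hardware.SensorEvent;

    // Assumed shape of the library's callback interface.
    interface MotionListener {
        void motionHappened(int gestureType);
    }

    // Assumed shape of a Controller (one exists for motion, one for shake).
    interface MotionController {
        void setListener(MotionListener listener);
        void record(SensorEvent event);
    }

    // App-side logic in the style of BenderLogic: a synchronized method
    // forwards sensor events to the Controller, which calls back when a
    // gesture is detected.
    public class CatchLogic implements MotionListener {

        static final int GESTURE_MOTION = 0; // hypothetical constants
        static final int GESTURE_SHAKE  = 1;

        private final MotionController controller;

        public CatchLogic(MotionController controller) {
            this.controller = controller;
            controller.setListener(this);   // register for callbacks
        }

        // Called from SensorEventListener.onSensorChanged(...) in the app.
        public synchronized void handle(SensorEvent event) {
            controller.record(event);
        }

        @Override
        public void motionHappened(int gestureType) {
            if (gestureType == GESTURE_SHAKE) {
                // richer feedback, e.g. voice or rattle
            } else {
                // plain motion after inactivity: launch the notifications
            }
        }
    }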

Beta test

The second stage of beta testing is in progress.

Second stage: program usage

Testers can download a beta version of the app and install it on any Android device (if equipped with an accelerometer). It is required to uninstall the old version of the app first, since it was unsigned and cannot be upgraded.

Main steps to use the App are:

1) press the menu button, select the Feedback submenu and enable some sort of feedback; suggested are "Vibrate" for motion and "Voice" or "Rattle" for shake; it’s NOT advised to enable more than one feedback per event, even if it’s technically possible;

2) drag the "inactivity" bar on the main screen and select a suitable time; for testing purposes a value of 10-20 seconds is advised, while for general use it is suggested to choose at least 5 minutes;

3) for testing it’s advised to enable "Fake user data" in advanced options, otherwise no feedback will occur if there isn’t any event;

4) start the service and try to leave the device at rest for some time, then pick it up and see if a feedback occurs; after the first feedback, the device tries to detect a shake for some seconds; if a shake occurs, further notifications are triggered;

5) report any problem, question or suggestion to Raffaele De Falco; if the app doesn't correctly detect motion or a shake, I can suggest how to change the "advanced options".

Beta testing results

First impressions are:

  • Huawei Ideos / Android 2.2 works with "Medium" and "Sensible" presets, doesn’t work when in standby mode, detects shake on X and Y axis;
  • Samsung Galaxy S / Android 2.2 works with "Sensible" and "Paranoid" presets, works even in standby mode, detects shake only on X axis;

First stage: data collection

Testers can download a beta version of the app and install it on any Android device. The included "Bender Recorder" application can record sensor data and save it into a .CSV file; this file should then be sent via e-mail to Raffaele De Falco for further analysis.

At the moment, the developer needs recordings in four usage scenarios:

1) device at rest (i.e. on a table)

2) user taking the device

3) user shaking the device

4) user putting the device in standby using the button, then powering it up again

(many thanks to Angelo Santarella, Luca Viscito, Luca Vicidomini, Nello Sorrentino, Emidio Bianco, Ada Mancuso for their help in data collecting)

Data collection results

After rigorous data collection and analysis, based on sensor data from 6 (and counting…) different devices running Android versions from 2.2 to 2.3.4, some conclusions already seem acceptable:

  • Nearly all devices report similar accelerometer raw values, with the sum over all three axes between 50 and 100; only the Samsung Corby produced weird values around 500, probably due to a wrong mass calculation, since according to the Android development docs the acceleration is calculated as Ad = -∑Fs / mass;
  • The initial "imposed threshold" value of 5.0 for motion detection, based on the Huawei Ideos data, is generally acceptable for the other devices except the Samsung Corby; it can however be fully customized, and possibly reduced for devices like the Samsung Galaxy / Galaxy II, which have slightly lower sensor values (see the sketch after this list);
  • Shakes performed by users other than the developer seem to match the pattern initially detected: most of the up-and-down values are on the X axis, sometimes involving Y and Z; the proposed algorithm can be considered valid;
  • The "Sensor Standby Bug" detected on Android 2.2 seems to be fixed in devices based on version 2.3; the only exception is the LG Optimus One with an unofficial 2.3.4 version, most likely due to a port that did not rewrite the hardware drivers from the older official 2.2 version.

Source Code

This project is also hosted on SourceForge: BenderCatch on SourceForge. The code can be checked out via SVN.

The code is licensed under the GNU Lesser General Public License v3 (LGPLv3)


Current events



(Embedded Google Calendar of lab events, timezone Europe/Rome.)


Dinner, 21 July 2001



21 July 2001

Participants

Vittorio Scarano, Alberto Negro, Giuseppina Palmieri, Delfina Malandrino, Rosario De Chiara, Ugo Erra, Roberto Capuano, Maria Licciardi, Tania Cillo, Rosario Boccia, Nadia Romano, Umberto Ferraro, Nadia De Vito


Lunch, 21 December 2001



21 December 2001

Participants

Alberto Negro, Vittorio Scarano, Giuseppina Palmieri, Filomena De Santis, Rosario De Chiara, Delfina Malandrino, Ugo Erra, Roberto Capuano, Raffaella Grieco, Francesca Nardone, Sabrina Senatore, Nadia Romano, Daniela Mea, Elisabetta Martone, Maria Licciardi, Tino Brogna, Umberto Ferraro, Lara De Vinco, Angelo Ciaramella, Antonino Staiano, Giovanni Acampora, Michele Cosentino


Lunch, 22 December 2000



22 December 2000

Participants

Alberto Negro, Vittorio Scarano, Giuseppina Palmieri, Filomena De Santis, Rosario De Chiara, Delfina Malandrino, Ugo Erra, Rosario Boccia, Maria Licciardi, Giuseppe Cupo, Maria Barra, Amelia De Vivo, Giada Iannuzzi, Antonietta Lamberti, Pompeo Faruolo, Umberto Ferraro, Gennaro Meo, Nicola Cirillo, Daniela Mea, Roberto Capuano, Gianluca De Marco, Carmen Rinaldi, Michele Pirone, Bruno De Gemmis, Ada Compagnone, Tania Cillo, Francesca Nardone, Pino Mea, Luca Paolino, Rosanna Malandrino, Paolo Luongo, Sabrina Senatore, Pasquale Del Gaudio, Luigi Catuogno, Paolo d'Arco


FAQSeminari




Some questions about the seminars

"Di cosa si parla?"

About what we are working on, something we have read, or something fun/stimulating that has to do with computer science.

"Chi può seguire i seminari?"

We obviously have fairly strict requirements: only human beings are admitted; so if you are a Klingon or a Unicorn, please send an email before attending. 🙂 Seriously, you don't need to be an ISISLab thesis student, or to be planning to ask for a thesis in ISISLab, to attend the seminars. If you want, you can also ask to join the lab's mailing list to get weekly updates about the seminars.

"Dovrò fare anche io dei seminari?"

The answer (if you are an ISISLab thesis student) is "Certainly!". Giving (several) seminars helps you learn to present your work as effectively as possible and improves how you communicate with others. It also serves as preparation for future job interviews: very often one of the standard questions is "What did you do in your thesis?", simply because it counts as a free-choice question, and those are quite dangerous: since the candidate picks the topic alone, the best the candidate can offer is expected.

"Quanti seminari dovrò tenere?"

Each thesis student usually gives 2-3 seminars, each with a different goal. The first seminar usually covers the technology/problem being tackled; the second starts describing the solution being adopted, while the third (and final) one describes the results obtained.

"Che errori devo evitare?"

Here is a list of the typical seminar mistakes, collected simply to help you avoid tedious interruptions (from me!) every time you make them.

"Il seminario breve in seduta di laurea è diverso da quelli che tengo in ISISLab?"

Yes, certainly! It obviously has a very different slant and requires extreme care to achieve its goal: presenting your work in very little time, in a way that the whole committee can appreciate. Consult the advice on short talks that is available.