Human Generated Data

Title

Untitled (group of school children with teacher, building visited by Lafayette in 1825)

Date

1951

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18418

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Human 99.6
Person 99.6
Person 99.5
Person 99.5
Person 99.5
Clothing 99.4
Apparel 99.4
Person 99.1
Person 98.3
Person 96.8
Person 93.3
Footwear 92.5
Shoe 92.5
Housing 90
Building 90
People 86.2
Shorts 85.9
Skirt 77.9
Shoe 77.7
Person 77
Porch 73.7
Person 72.5
House 72.1
Female 71.9
Kid 67.4
Child 67.4
Outdoors 66.9
Countryside 66.9
Shelter 66.9
Nature 66.9
Rural 66.9
Person 65.9
Tartan 65.4
Plaid 65.4
Villa 64.8
Door 61.3
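Each machine-generated tag above carries a confidence score on a 0–100 scale (the same scale Rekognition reports). A minimal sketch, in plain Python with illustrative values copied from the list above, of how a downstream consumer might filter these tags by a threshold:

```python
# Illustrative (label, confidence) pairs taken from the Amazon tag
# list above; confidences are on a 0-100 scale.
labels = [
    ("Human", 99.6), ("Person", 99.6), ("Clothing", 99.4),
    ("Footwear", 92.5), ("Building", 90.0), ("People", 86.2),
    ("Porch", 73.7), ("Child", 67.4), ("Door", 61.3),
]

def filter_labels(pairs, min_confidence=90.0):
    """Keep only tags at or above the confidence threshold,
    analogous to Rekognition's MinConfidence request parameter."""
    return [name for name, score in pairs if score >= min_confidence]

print(filter_labels(labels))
# -> ['Human', 'Person', 'Clothing', 'Footwear', 'Building']
```

The threshold is the main tuning knob: at 90 the borderline "Building" tag (exactly 90.0) is kept, while context tags such as "Porch" and "Child" are dropped.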

Imagga
created on 2022-03-04

kin 68.3
musical instrument 29.7
people 24
man 22.2
wind instrument 21
accordion 19.2
dress 17.1
adult 16.9
person 15.9
silhouette 15.7
couple 15.7
keyboard instrument 15.4
male 14.9
bride 14.4
women 14.2
love 13.4
happy 13.1
fashion 12.8
sunset 12.6
portrait 12.3
boy 12.2
room 11.8
married 11.5
wedding 11
two 11
happiness 11
together 10.5
style 10.4
groom 10
hand 9.9
classroom 9.7
men 9.4
bouquet 9.4
world 9.2
child 9.1
business 9.1
black 9
human 9
outdoors 8.9
sky 8.9
romantic 8.9
party 8.6
culture 8.5
walking 8.5
dance 8.5
sport 8.4
outdoor 8.4
summer 8.4
holding 8.2
performer 8.2
girls 8.2
dancer 8.2
group 8.1
romance 8
water 8
celebration 8
lifestyle 7.9
day 7.8
color 7.8
bridal 7.8
youth 7.7
art 7.6
marriage 7.6
wife 7.6
dark 7.5
friends 7.5
teacher 7.5
fun 7.5
traditional 7.5
park 7.4
professional 7.3
brass 7.3
smiling 7.2
office 7.2
active 7.2
holiday 7.2
interior 7.1
businessman 7.1
building 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 99.5
outdoor 99.1
clothing 95
person 94.6
footwear 83.2
woman 78.6
black and white 69.1
house 68.1
black 66.7
white 66.5
posing 40.7

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 99.1%
Calm 58.1%
Happy 29.3%
Disgusted 3%
Surprised 2.9%
Fear 2.8%
Sad 1.8%
Confused 1.2%
Angry 0.8%

AWS Rekognition

Age 41-49
Gender Male, 83.2%
Calm 92.7%
Happy 4.6%
Sad 1.4%
Confused 0.4%
Surprised 0.3%
Fear 0.2%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 47-53
Gender Male, 93.9%
Calm 49.2%
Happy 37.9%
Sad 4.5%
Disgusted 3%
Confused 2%
Surprised 1.5%
Fear 1.1%
Angry 0.7%

AWS Rekognition

Age 31-41
Gender Female, 98.7%
Happy 43.6%
Calm 33.8%
Sad 14.3%
Surprised 2.5%
Confused 2.1%
Angry 1.5%
Disgusted 1.4%
Fear 0.6%

AWS Rekognition

Age 35-43
Gender Male, 98.2%
Sad 44.3%
Happy 26.4%
Calm 12.7%
Fear 7.3%
Surprised 2.5%
Disgusted 2.3%
Confused 2.3%
Angry 2.3%

AWS Rekognition

Age 37-45
Gender Female, 90.5%
Calm 30.6%
Happy 25.2%
Sad 23.6%
Surprised 8.1%
Fear 5.9%
Disgusted 2.5%
Angry 2.1%
Confused 1.9%

AWS Rekognition

Age 41-49
Gender Male, 65.8%
Calm 85.9%
Surprised 7.7%
Sad 3.3%
Happy 1.5%
Angry 0.6%
Confused 0.4%
Fear 0.4%
Disgusted 0.2%

AWS Rekognition

Age 30-40
Gender Female, 64.7%
Sad 91.1%
Calm 5.2%
Fear 1.1%
Happy 0.8%
Confused 0.6%
Angry 0.4%
Disgusted 0.4%
Surprised 0.4%

AWS Rekognition

Age 45-51
Gender Male, 99.6%
Calm 99.1%
Surprised 0.4%
Happy 0.4%
Sad 0.1%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 31-41
Gender Male, 81.1%
Happy 86.7%
Calm 10.2%
Sad 1.7%
Disgusted 0.4%
Angry 0.4%
Confused 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 47-53
Gender Male, 82.5%
Calm 97.4%
Happy 1.1%
Surprised 0.4%
Sad 0.3%
Angry 0.3%
Confused 0.3%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 48-54
Gender Female, 65.9%
Calm 89.3%
Happy 4.6%
Surprised 3%
Fear 1.5%
Sad 0.7%
Confused 0.5%
Angry 0.3%
Disgusted 0.2%

AWS Rekognition

Age 38-46
Gender Male, 55.4%
Happy 62.7%
Calm 21.5%
Sad 9.8%
Confused 2.6%
Fear 2%
Disgusted 0.6%
Surprised 0.4%
Angry 0.3%

AWS Rekognition

Age 26-36
Gender Male, 84.8%
Happy 99.8%
Calm 0.1%
Sad 0%
Disgusted 0%
Surprised 0%
Fear 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 41-49
Gender Female, 85.3%
Happy 38.9%
Calm 35.1%
Sad 22.4%
Confused 1%
Disgusted 1%
Surprised 0.7%
Fear 0.5%
Angry 0.4%

AWS Rekognition

Age 27-37
Gender Male, 86.8%
Surprised 41.7%
Happy 30.7%
Calm 23.9%
Fear 1.5%
Disgusted 0.6%
Angry 0.6%
Sad 0.6%
Confused 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 92.5%

Captions

Microsoft

a group of people standing in front of a building 92.1%
a group of people posing for a photo in front of a building 90.8%
a group of people posing for a photo 88.5%

Text analysis

Amazon

ADAMS
TEACHER
ACADEMY
FEMALE
VISITED
VISITED BY
BY
MARY
1825
1824-1828
MARY LYON
IN
LAFAYETTE
LYON
DA
YT33A2
YT33A2 NAGOY
NAGOY

Google

ADAMS FEMALE ACADEMY MARY LYON TEACHER 1824-1828 VISITED BY LAFAYETTE IN 1825
ADAMS
ACADEMY
MARY
TEACHER
1825
IN
FEMALE
1824-1828
VISITED
LAFAYETTE
LYON
BY