Human Generated Data

Title

Untitled (class picture in front of school)

Date

c. 1930

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2012

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.7
Apparel 99.7
Person 99.4
Human 99.4
Person 99.4
Person 99.3
Person 99.2
Person 99
Person 98.9
Person 98.8
Person 98.5
Person 98.4
Person 98.2
Person 97.2
Person 93.9
Shorts 92.4
Person 90.5
Person 89.9
Skirt 88.4
People 81.6
Tartan 81.4
Plaid 81.4
Person 78.7
Female 72.7
Kilt 70.4
Person 67.2
Crowd 65.6
Person 61.5
Girl 57.5

Imagga
created on 2021-12-14

kin 55.3
people 29.5
male 20.5
group 20.1
man 18.1
beach 17.7
adult 17.1
happy 16.9
summer 16.7
silhouette 16.5
fun 16.4
person 16.4
outdoors 15.7
happiness 15.7
men 15.4
outdoor 15.3
sport 14.8
couple 14.8
water 14.7
child 14.4
lifestyle 13.7
leisure 13.3
walking 13.2
vacation 13.1
portrait 12.9
joy 12.5
sand 12.2
sibling 11.9
classroom 11.8
love 11.8
room 11.6
smiling 11.6
family 11.5
boy 11.3
women 11.1
dog 10.8
ocean 10.8
teacher 10.7
travel 10.6
together 10.5
friends 10.3
business 10.3
friendship 10.3
sky 10.2
sea 10.2
active 9.9
sunset 9.9
run 9.6
day 9.4
winter 9.3
exercise 9.1
activity 8.9
educator 8.8
walk 8.6
kids 8.5
enjoy 8.5
life 8.4
relax 8.4
attractive 8.4
action 8.3
city 8.3
holding 8.2
cheerful 8.1
holiday 7.9
play 7.7
world 7.7
father 7.7
crowd 7.7
running 7.7
togetherness 7.5
field 7.5
parent 7.4
park 7.4
professional 7.3
girls 7.3
sun 7.2
body 7.2
transportation 7.2
sunlight 7.1
kid 7.1
interior 7.1
businessman 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

person 99.3
text 99.3
soccer 95.5
clothing 93
window 88.7
sports equipment 83.9
footwear 81.2
black 77.7
old 68.1
baseball 64.3
white 63.2
man 59.5
playground 56.8
football 54.3
posing 43.9

Face analysis

AWS Rekognition

Age 21-33
Gender Male, 86.2%
Calm 70%
Sad 11.9%
Angry 5.5%
Happy 5%
Confused 2.7%
Fear 2.1%
Disgusted 1.8%
Surprised 1%

AWS Rekognition

Age 54-72
Gender Female, 52.2%
Calm 89.7%
Sad 4.7%
Happy 2.3%
Angry 1.8%
Confused 0.8%
Surprised 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 50-68
Gender Male, 86.7%
Calm 77.7%
Happy 15.8%
Sad 1.6%
Surprised 1.6%
Angry 1.4%
Confused 0.9%
Disgusted 0.6%
Fear 0.4%

AWS Rekognition

Age 41-59
Gender Female, 65.7%
Calm 45.9%
Happy 39.2%
Surprised 6.5%
Fear 2.7%
Sad 1.8%
Confused 1.6%
Angry 1.4%
Disgusted 0.8%

AWS Rekognition

Age 30-46
Gender Female, 60.3%
Calm 76.6%
Sad 9.9%
Happy 5.6%
Angry 4.1%
Disgusted 1.2%
Confused 1.2%
Surprised 0.7%
Fear 0.6%

AWS Rekognition

Age 23-35
Gender Female, 69%
Calm 54.4%
Sad 35.7%
Fear 4.6%
Confused 3%
Surprised 1.3%
Happy 0.5%
Angry 0.5%
Disgusted 0.2%

AWS Rekognition

Age 42-60
Gender Male, 64.6%
Calm 40%
Surprised 32.3%
Happy 14.7%
Angry 4.8%
Disgusted 2.8%
Confused 2.2%
Fear 1.8%
Sad 1.5%

AWS Rekognition

Age 45-63
Gender Male, 91.7%
Sad 62.4%
Calm 28.3%
Confused 4%
Angry 2%
Surprised 1.1%
Happy 0.9%
Fear 0.9%
Disgusted 0.3%

AWS Rekognition

Age 48-66
Gender Male, 58.7%
Calm 95.4%
Sad 3.5%
Confused 0.5%
Happy 0.4%
Angry 0.1%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 47-65
Gender Male, 82.6%
Calm 77%
Sad 10.5%
Confused 5.6%
Surprised 3.6%
Happy 1.4%
Angry 1.1%
Disgusted 0.7%
Fear 0.2%

AWS Rekognition

Age 18-30
Gender Female, 51.4%
Calm 94.6%
Surprised 1.8%
Happy 1.4%
Sad 0.8%
Angry 0.7%
Disgusted 0.2%
Confused 0.2%
Fear 0.2%

AWS Rekognition

Age 25-39
Gender Male, 82.1%
Calm 96.5%
Sad 1.2%
Confused 0.5%
Happy 0.5%
Surprised 0.4%
Disgusted 0.3%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 23-37
Gender Female, 64.8%
Calm 86.8%
Surprised 6.7%
Sad 3%
Confused 1.6%
Happy 0.9%
Angry 0.5%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 46-64
Gender Male, 61.9%
Calm 73.2%
Happy 11%
Surprised 8.9%
Sad 2%
Confused 1.8%
Fear 1.4%
Disgusted 0.9%
Angry 0.7%

AWS Rekognition

Age 22-34
Gender Male, 85.1%
Calm 92.5%
Sad 1.7%
Confused 1.6%
Happy 1.5%
Surprised 1.1%
Angry 0.6%
Fear 0.5%
Disgusted 0.4%

AWS Rekognition

Age 24-38
Gender Male, 66.1%
Calm 71.5%
Confused 12.6%
Happy 6.5%
Surprised 3.7%
Sad 3.6%
Fear 1.2%
Angry 0.6%
Disgusted 0.5%

AWS Rekognition

Age 36-52
Gender Female, 56.3%
Calm 91.8%
Sad 6.7%
Happy 0.7%
Confused 0.5%
Angry 0.1%
Disgusted 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 47-65
Gender Male, 83.4%
Calm 76.6%
Happy 20.1%
Sad 0.9%
Surprised 0.9%
Angry 0.7%
Confused 0.5%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 23-37
Gender Male, 72.2%
Surprised 45.8%
Calm 37%
Confused 7.2%
Fear 3.8%
Happy 3.6%
Sad 1.2%
Angry 0.8%
Disgusted 0.6%

AWS Rekognition

Age 23-35
Gender Male, 57.3%
Calm 94.7%
Happy 1.4%
Sad 1.4%
Surprised 1.1%
Angry 0.6%
Fear 0.4%
Confused 0.2%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a vintage photo of a group of people standing in front of a building 87.3%
a vintage photo of a group of people posing for the camera 87.2%
a vintage photo of a group of people standing in front of a window 81.6%