Human Generated Data

Title

Untitled (people eating at party, large stuffed bear in room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17204

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-02-26

Person 99.2
Human 99.2
Person 99
Person 98.9
Person 98.6
Person 98.6
Person 98.6
Person 98.5
Person 98
Person 97.1
Person 94.4
Crowd 86.3
Person 83.8
People 77.2
Audience 75.6
Plant 74.3
Person 71.3
Room 71.1
Indoors 71.1
Potted Plant 57.8
Pottery 57.8
Jar 57.8
Vase 57.8
Clothing 57.8
Apparel 57.8
Girl 56
Female 56
Flower 56
Blossom 56
Funeral 55.5
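
The scores above are Rekognition's 0-100 confidence values. As a minimal sketch, labels like these could be requested with boto3; the file name and MinConfidence threshold below are illustrative assumptions, not part of this record:

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed cutoff; the tag list above stops near 55
)

# Each label carries a name and a 0-100 confidence score.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')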

Clarifai
created on 2023-10-29

people 99.8
group 99
many 98.6
woman 94.7
adult 93.7
music 93.7
administration 92.5
child 92.2
man 91.3
group together 90.8
leader 89.3
education 86.4
musician 84.6
war 83.9
audience 82.6
monochrome 80.7
instrument 80.6
wear 79.5
school 77.2
furniture 75.5
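
Clarifai's general model returns 0-1 probabilities, shown above scaled to percentages. A minimal sketch against the v2 REST API, assuming the public general-image-recognition model id; the API key and image URL are placeholders:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder credential
MODEL_URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {
    "inputs": [
        # Placeholder URL; this record's image location is not given here.
        {"data": {"image": {"url": "https://example.com/photo.jpg"}}}
    ]
}
resp = requests.post(MODEL_URL, json=payload,
                     headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

# Concepts carry a 0-1 value; scale by 100 to match the listing above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')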

Imagga
created on 2022-02-26

senior 32.8
man 32.2
person 31.8
stage 26.6
people 24.5
happy 24.4
couple 24.4
old 24.4
male 24.1
elderly 20.1
adult 18.8
mature 17.7
platform 17.6
happiness 17.2
spectator 17.1
portrait 16.2
men 15.4
aged 15.4
married 14.4
love 14.2
smiling 13.7
retired 13.6
smile 12.8
women 12.6
older 12.6
hand 12.1
entrepreneur 12
teacher 11.6
together 11.4
home 11.2
grandfather 11.1
husband 10.5
group 10.5
sax 10.4
groom 10.2
seniors 9.8
60s 9.8
fan 9.8
outdoors 9.7
retirement 9.6
looking 9.6
marriage 9.5
wife 9.5
sitting 9.4
lifestyle 9.4
holiday 9.3
cheerful 8.9
new 8.9
family 8.9
look 8.8
joy 8.3
leisure 8.3
human 8.2
girls 8.2
wind instrument 8.1
religion 8.1
romance 8
planner 7.9
indoors 7.9
executive 7.9
day 7.8
grandmother 7.8
hands 7.8
affection 7.7
summer 7.7
sky 7.6
loving 7.6
females 7.6
horizontal 7.5
fun 7.5
vintage 7.4
lady 7.3
business 7.3
black 7.2
room 7.2
handsome 7.1
romantic 7.1
follower 7.1
world 7.1
businessman 7.1
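
A minimal sketch of how tags like these could be fetched from Imagga's v2 tagging endpoint, which reports 0-100 confidences; the credentials and image URL are placeholders:

import requests

# Placeholder credentials; Imagga uses HTTP Basic auth (key, secret).
auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
    auth=auth,
)
resp.raise_for_status()

# Each entry pairs a localized tag name with a 0-100 confidence.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')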

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 99.2
text 98.7
clothing 83.3
man 83.3
black and white 79.1
statue 54.3
crowd 35.9
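
Azure Computer Vision's tag operation returns 0-1 confidences, shown above as percentages. A minimal sketch over REST, assuming API version v3.2; the endpoint, key, and image URL are placeholders:

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},  # placeholder URL
)
resp.raise_for_status()

# Tags carry a 0-1 confidence; scale by 100 to match the listing above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')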

Color Analysis

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 86.3%
Calm 96.2%
Sad 3.4%
Angry 0.2%
Confused 0.1%
Disgusted 0%
Happy 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 47-53
Gender Female, 82.9%
Calm 99.2%
Sad 0.6%
Confused 0.1%
Angry 0%
Happy 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 49-57
Gender Male, 99.9%
Calm 100%
Sad 0%
Surprised 0%
Happy 0%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 90.3%
Calm 99%
Happy 0.7%
Sad 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 29-39
Gender Female, 96.7%
Sad 69.3%
Calm 26.8%
Surprised 1.1%
Fear 0.7%
Confused 0.6%
Disgusted 0.6%
Happy 0.5%
Angry 0.4%

AWS Rekognition

Age 49-57
Gender Female, 94.3%
Sad 55.5%
Calm 43.3%
Confused 0.6%
Happy 0.4%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Surprised 0%

AWS Rekognition

Age 50-58
Gender Female, 94.2%
Calm 94.8%
Disgusted 2.4%
Sad 0.9%
Confused 0.6%
Happy 0.6%
Fear 0.2%
Surprised 0.2%
Angry 0.2%
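
Each block above is one detected face from Rekognition's DetectFaces with full attributes enabled; the age range, gender, and per-emotion scores map directly to its FaceDetails fields. A minimal sketch, with the file name as a placeholder:

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are scored independently; sort to mirror the listing above.
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emo["Type"].title()} {emo["Confidence"]:.1f}%')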

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
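
Each block above is one face annotation from Google Cloud Vision, which reports bucketed likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A minimal sketch with the google-cloud-vision client; the file name is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)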

Feature analysis

Amazon

Person
Person 99.2%
Person 99%
Person 98.9%
Person 98.6%
Person 98.6%
Person 98.6%
Person 98.5%
Person 98%
Person 97.1%
Person 94.4%
Person 83.8%
Person 71.3%
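
The per-person scores above repeat the Instances of the "Person" label in the DetectLabels response; each instance also carries a bounding box that this listing omits. A minimal sketch of reading them, with the file name as a placeholder:

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    if label["Name"] == "Person":
        for inst in label["Instances"]:
            box = inst["BoundingBox"]  # fractions of image width/height
            print(f'Person {inst["Confidence"]:.1f}% '
                  f'at left={box["Left"]:.2f}, top={box["Top"]:.2f}')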

Text analysis

Amazon

EELA
DOS
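
Short fragments like the two lines above are typical of text detection on incidental signage. A minimal sketch with Rekognition's DetectText; the file name is a placeholder:

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections aggregate words; WORD entries are individual tokens.
for det in response["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"])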