Human Generated Data

Title

Untitled (girl and two boys sitting outside door of house)

Date

1957

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9571

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 100
Apparel 100
Person 99.8
Human 99.8
Person 99.1
Person 98.8
Bonnet 98.3
Hat 98.3
Furniture 94.7
Chair 94.7
Dress 84.4
Footwear 69
Shoe 69
Sitting 66.8
Cap 65.8
Female 64.3
Girl 63.5
Face 62.6
Portrait 61.8
Photography 61.8
Photo 61.8
Shoe 61
Kid 60.2
Child 60.2
People 59.6
Baby 58.6
Home Decor 57
Sun Hat 56.5

Imagga
created on 2022-01-23

person 27.5
man 24.8
people 24
adult 17.8
male 16.3
dancer 16
sitting 14.6
black 14.5
lifestyle 14.4
men 13.7
urban 13.1
sport 12.7
fashion 12.1
fun 12
room 11.9
world 11.9
style 11.9
silhouette 11.6
television 11.5
one 11.2
attractive 11.2
athlete 11
casual 11
portrait 11
exercise 10.9
dark 10.8
salon 10.7
modern 10.5
pretty 10.5
performer 10.5
business 10.3
office 10.2
training 10.2
alone 10
happy 10
businessman 9.7
skill 9.6
looking 9.6
hair 9.5
women 9.5
negative 9.4
youth 9.4
dance 9.4
model 9.3
city 9.1
indoor 9.1
active 9
chair 8.9
body 8.8
indoors 8.8
couple 8.7
motion 8.6
smile 8.5
blackboard 8.5
relax 8.4
professional 8.3
event 8.3
leisure 8.3
player 8.2
sexy 8
face 7.8
film 7.8
stadium 7.8
summer 7.7
human 7.5
fly 7.5
lights 7.4
park 7.4
phone 7.4
pose 7.2
dress 7.2
teacher 7.2
art 7.2
posing 7.1
cool 7.1
interior 7.1
travel 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

clothing 97.1
text 95.8
person 95.1
outdoor 91.7
footwear 89.9
human face 77.7
black and white 70.2

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 86%
Happy 77.7%
Calm 14.2%
Surprised 2.7%
Fear 1.6%
Disgusted 1.4%
Sad 1%
Angry 0.8%
Confused 0.6%

AWS Rekognition

Age 27-37
Gender Male, 99.3%
Calm 93%
Surprised 3.5%
Happy 1.8%
Fear 0.5%
Sad 0.4%
Confused 0.3%
Angry 0.3%
Disgusted 0.2%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Calm 97.9%
Surprised 1.4%
Disgusted 0.2%
Happy 0.2%
Fear 0.1%
Angry 0.1%
Sad 0.1%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 69%

Captions

Microsoft

a group of people standing next to a window 77%
a group of people standing in front of a building 76.9%
a group of people standing in front of a window 76.1%

Text analysis

Amazon

on
25110
on 4
4

Google

--
2 -- XAGON
2
XAGON