Human Generated Data

Title

Untitled (kids doing gymnastics)

Date

1973

People

Artist: Mary Lowber Tiers, American 1916 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15831

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.7
Human 99.7
Person 99
Person 98.6
Person 98.3
Person 98.1
Female 96.2
Clothing 95.7
Apparel 95.7
Blonde 95.3
Girl 95.3
Woman 95.3
Kid 95.3
Teen 95.3
Child 95.3
Collage 93.5
Advertisement 93.5
Poster 93.5
Face 91.8
Chair 85.6
Furniture 85.6
People 81.7
Indoors 80.9
Shorts 78.7
Room 76.9
Person 76
Living Room 72.6
Portrait 70.4
Photography 70.4
Photo 70.4
Chair 63.7
Hair 58.8
Flyer 56.8
Paper 56.8
Brochure 56.8
Person 54.4
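
The Amazon tags above have the shape of Amazon Rekognition's DetectLabels output: one label name per line with a confidence score on a 0-100 scale. Below is a minimal sketch of how tags like these could be retrieved with boto3; the filename is hypothetical and AWS credentials are assumed to be configured in the environment.

```python
import boto3

# Sketch only: the filename is hypothetical and AWS credentials are assumed
# to be configured in the environment.
client = boto3.client("rekognition")

with open("untitled_gymnastics.jpg", "rb") as image_file:
    response = client.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=50,
    )

# Each returned label carries a name and a confidence score (0-100).
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```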

Clarifai
created on 2023-10-29

people 99.8
monochrome 98.5
group 98.4
adult 98.3
woman 97.4
man 96.8
group together 95.3
wear 94.8
child 93.1
adolescent 91.8
indoors 91.2
family 90.1
dancer 90.1
education 88.3
actor 88.2
music 87.7
several 87.5
dancing 87.5
recreation 87.4
furniture 87.2
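
The Clarifai concepts above are the kind of output produced by Clarifai's general image-recognition model, which scores concepts on a 0-1 scale (shown here as percentages). A hedged sketch against the v2 REST API follows; the access token and image URL are placeholders, and the public model and app identifiers are assumptions.

```python
import requests

# Sketch only: the token and image URL are placeholders; the model ID and the
# public app identifiers for Clarifai's general model are assumptions.
response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT"},
    json={
        "user_app_id": {"user_id": "clarifai", "app_id": "main"},
        "inputs": [{"data": {"image": {"url": "https://example.com/image.jpg"}}}],
    },
)

# Concept values are returned on a 0-1 scale; multiply by 100 to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```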

Imagga
created on 2022-02-05

window 28.2
balcony 25.5
newspaper 20.7
product 17
room 16.4
people 15
blackboard 14.9
building 14.2
wall 13.7
home 13.5
house 13.4
interior 13.3
creation 13.2
architecture 12.7
light 12.7
person 12.3
man 12.1
structure 12
indoor 11.9
glass 11.7
city 10.8
male 10.7
attractive 10.5
design 10.1
business 9.7
shop 9.5
dream 9.4
adult 9.4
travel 9.1
pretty 9.1
sky 8.9
framework 8.9
looking 8.8
drawing 8.8
lifestyle 8.7
windows 8.6
black 8.4
old 8.4
vintage 8.3
dress 8.1
barbershop 8.1
classroom 8
hair 7.9
women 7.9
indoors 7.9
cute 7.9
child 7.9
holiday 7.9
urban 7.9
chair 7.8
high 7.8
sitting 7.7
office 7.6
decoration 7.6
art 7.6
fashion 7.5
happy 7.5
outdoors 7.5
alone 7.3
lady 7.3
sexy 7.2
portrait 7.1
businessman 7.1
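
The Imagga tags above correspond to the /v2/tags endpoint, which returns tag names with confidences on a 0-100 scale. A sketch using the REST API follows; the API key, secret, and image URL are placeholders.

```python
import requests

# Sketch only: credentials and the image URL are placeholders.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)

# Each entry holds a confidence (0-100) and the tag text keyed by language.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```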

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 96.8
person 88.5
clothing 85.4
posing 39
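
The Microsoft tags above match the Tags feature of the Azure Computer Vision Analyze API, which reports confidences on a 0-1 scale (shown here as percentages). A sketch against the v3.2 REST endpoint follows; the resource endpoint, key, and image URL are placeholders.

```python
import requests

# Sketch only: the endpoint, key, and image URL are placeholders.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
response = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/image.jpg"},
)

# Tag confidences come back on a 0-1 scale; scale to match the list above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```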

Color Analysis

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 99%
Sad 70.4%
Calm 20.3%
Angry 2.7%
Confused 2.5%
Surprised 1.8%
Happy 1.3%
Disgusted 0.7%
Fear 0.4%

AWS Rekognition

Age 35-43
Gender Male, 81.2%
Surprised 96.6%
Calm 1.9%
Fear 0.5%
Disgusted 0.3%
Angry 0.2%
Confused 0.2%
Sad 0.2%
Happy 0.1%

AWS Rekognition

Age 39-47
Gender Male, 99.8%
Sad 66.3%
Confused 22%
Calm 3.6%
Surprised 2.9%
Happy 2.4%
Disgusted 1.2%
Angry 1%
Fear 0.6%

AWS Rekognition

Age 24-34
Gender Male, 99%
Surprised 89.8%
Calm 3.1%
Happy 2.5%
Angry 1.5%
Sad 1.1%
Disgusted 0.9%
Fear 0.5%
Confused 0.4%

AWS Rekognition

Age 16-22
Gender Male, 50.7%
Sad 94.6%
Confused 2.4%
Calm 1.6%
Angry 0.8%
Disgusted 0.3%
Fear 0.1%
Happy 0.1%
Surprised 0.1%
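
The five face records above follow the shape of Amazon Rekognition's DetectFaces response: an estimated age range, a gender guess with its confidence, and a confidence score for each emotion. A minimal boto3 sketch follows; the filename is hypothetical and credentials are assumed to be configured.

```python
import boto3

# Sketch only: the filename is hypothetical and AWS credentials are assumed
# to be configured in the environment.
client = boto3.client("rekognition")

with open("untitled_gymnastics.jpg", "rb") as image_file:
    response = client.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion confidences are listed highest first, as in the records above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```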

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
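
The Google Vision face records above report likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch using the google-cloud-vision Python client follows; the filename is hypothetical, application default credentials are assumed, and the enum formatting assumes the 2.x (proto-plus) client.

```python
from google.cloud import vision

# Sketch only: the filename is hypothetical and application default
# credentials are assumed; written against the 2.x client.
client = vision.ImageAnnotatorClient()
with open("untitled_gymnastics.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Likelihood fields are enums such as VERY_UNLIKELY or UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```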

Feature analysis

Amazon

Person 99.7%
Person 99%
Person 98.6%
Person 98.3%
Person 98.1%
Person 76%
Person 54.4%
Chair 85.6%
Chair 63.7%
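
The per-object percentages above appear to be the instance-level detections from the same Rekognition DetectLabels response: object labels such as Person and Chair carry an Instances list, one entry per bounding box, each with its own confidence. A sketch follows; the filename is hypothetical.

```python
import boto3

# Sketch only: the filename is hypothetical.
client = boto3.client("rekognition")

with open("untitled_gymnastics.jpg", "rb") as image_file:
    response = client.detect_labels(Image={"Bytes": image_file.read()})

# Object labels include one Instances entry per detected bounding box,
# each with its own confidence score.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        print(f'{label["Name"]} {instance["Confidence"]:.1f}%')
```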

Categories

Text analysis

Amazon

KODAK
WTTB
FILM
P
KODAK SAFETY
SAFETY
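
The strings above (KODAK, SAFETY, FILM, WTTB) match the output of Amazon Rekognition's DetectText operation and most likely come from the film's edge markings. A boto3 sketch follows; the filename is hypothetical.

```python
import boto3

# Sketch only: the filename is hypothetical.
client = boto3.client("rekognition")

with open("untitled_gymnastics.jpg", "rb") as image_file:
    response = client.detect_text(Image={"Bytes": image_file.read()})

# LINE detections are grouped strings; WORD detections are individual tokens.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```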

Google

WTTB KODAK
WTTB
KODAK
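
The Google results above look like Vision API text detection, where the first annotation is the full detected string and the remaining annotations are individual words. A sketch with the google-cloud-vision client follows; the filename is hypothetical and application default credentials are assumed.

```python
from google.cloud import vision

# Sketch only: the filename is hypothetical and application default
# credentials are assumed.
client = vision.ImageAnnotatorClient()
with open("untitled_gymnastics.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.text_detection(image=image)
# The first annotation is the full detected text block; the rest are words.
for annotation in response.text_annotations:
    print(annotation.description)
```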