Human Generated Data

Title

Untitled (group of children, girl with snake)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20291

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.9
Person 99.9
Person 99.7
Person 99.7
Person 99.6
Person 99.5
Clothing 99
Apparel 99
Person 98.6
Shorts 97.4
Person 96.1
Female 89.5
Person 87.8
Person 85
Footwear 81.4
Shoe 81.4
People 74.8
Woman 73
Poster 68.8
Advertisement 68.8
Girl 65.7
Hair 64.4
Kid 59.3
Blonde 59.3
Teen 59.3
Child 59.3
Dress 59.2
Skirt 58.9
Paper 55.4
Shoe 53.3
Person 49.6
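
The Amazon labels above pair each tag with a confidence score. One way output of this kind could be reproduced is with the Rekognition DetectLabels operation via the boto3 SDK; the sketch below is illustrative only, and the filename, region, and MinConfidence threshold are placeholders rather than details of the original tagging run.

# Sketch: regenerate Rekognition-style labels for this photograph.
# AWS credentials are assumed to be configured; the filename is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_20291.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # assumed threshold; the list above reaches down to ~49.6
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")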

Imagga
created on 2022-03-05

person 35.9
people 35.1
man 30.9
silhouette 30.6
supporter 28.2
male 26.9
crowd 26.9
sexy 24.1
businessman 22.1
team 21.5
teamwork 20.4
audience 19.5
group 19.3
work 18.2
businesswoman 18.2
design 18
nation 18
cheering 17.6
lights 17.6
nighttime 17.6
business 17.6
stadium 17.5
clothing 17.5
bathing cap 17.5
flag 17.4
patriotic 17.2
job 16.8
vibrant 16.6
fashion 16.6
presentation 15.8
occupation 15.6
leader 15.4
women 15
bright 15
vivid 14.9
adult 14.8
model 14.8
boss 14.3
icon 14.2
cap 14.2
symbol 14.1
couple 13.9
supporters 13.8
president 13.7
speech 13.7
planner 13.7
headdress 13.5
body 12.8
sport 12.2
men 12
style 11.9
black 11.7
portrait 11.6
human 10.5
attractive 9.8
pretty 9.8
dance 9.6
brass 9.6
meeting 9.4
player 8.9
professional 8.7
clothes 8.4
friendship 8.4
art 8.4
pose 8.1
happy 8.1
lifestyle 7.9
love 7.9
elegant 7.7
wind instrument 7.7
lady 7.3
girls 7.3
star 7.3
posing 7.1
stage 7.1
together 7
modern 7
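
The Imagga tags above follow the same label-plus-confidence pattern. A minimal sketch of requesting such tags from Imagga's v2 tagging endpoint with the requests library is shown below; the API key, secret, and filename are placeholders, and the endpoint and response shape should be checked against Imagga's current documentation.

# Sketch: request Imagga-style tags for the image.
# Key, secret, and filename are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"

with open("steinmetz_20291.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")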

Google
created on 2022-03-05

Shorts 90.3
Human 89.1
Black-and-white 86.6
Standing 86.4
Style 84.1
Adaptation 79.3
Fun 77.7
Monochrome 76.2
Monochrome photography 74.2
Event 73.1
Vintage clothing 71.3
Fashion design 69.8
Team 65.3
Room 64.1
Photo caption 63.7
Leisure 62.7
Font 60.9
Child 59.1
Visual arts 58.9
Advertising 58.3
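
The Google labels above appear to be the Vision API's 0-1 scores expressed as percentages. A minimal sketch using the google-cloud-vision client library follows; application credentials are assumed to be configured and the filename is a placeholder.

# Sketch: reproduce Google-style labels with the Cloud Vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_20291.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for annotation in response.label_annotations:
    # The API returns scores in [0, 1]; scale to match the listing above.
    print(f"{annotation.description} {annotation.score * 100:.1f}")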

Microsoft
created on 2022-03-05

person 99.7
clothing 94.2
text 91
woman 80.5
standing 80.2
footwear 72.5
dance 72.3
dress 67.3
black and white 53.3
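
The Microsoft tags above can be approximated with the Azure Computer Vision tagging operation. The sketch below uses the azure-cognitiveservices-vision-computervision package; the endpoint, key, and filename are placeholders.

# Sketch: reproduce Microsoft-style tags via the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("steinmetz_20291.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # Azure reports confidence in [0, 1]; scale to match the listing above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")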

Face analysis

AWS Rekognition

Age 48-56
Gender Female, 57.3%
Calm 99.4%
Happy 0.1%
Surprised 0.1%
Sad 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%
Confused 0%

AWS Rekognition

Age 30-40
Gender Female, 86.5%
Happy 74.6%
Calm 16%
Sad 6.5%
Angry 0.8%
Disgusted 0.8%
Fear 0.5%
Surprised 0.4%
Confused 0.3%

AWS Rekognition

Age 36-44
Gender Female, 100%
Calm 99.7%
Surprised 0.3%
Sad 0%
Happy 0%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Female, 73.5%
Calm 98.2%
Sad 1.6%
Confused 0.1%
Happy 0.1%
Surprised 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 45-53
Gender Male, 82.4%
Happy 93.7%
Calm 2.4%
Sad 2%
Surprised 0.6%
Confused 0.4%
Disgusted 0.3%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Male, 50.1%
Sad 80.9%
Happy 6.4%
Calm 4.4%
Confused 2.9%
Fear 2.8%
Disgusted 1.1%
Surprised 0.9%
Angry 0.6%

AWS Rekognition

Age 23-33
Gender Male, 86.8%
Calm 97.2%
Sad 2.5%
Confused 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0%
Disgusted 0%
Happy 0%
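
Each AWS Rekognition block above reports an estimated age range, a gender guess with its confidence, and a ranked emotion breakdown for one detected face. A sketch of how such per-face estimates could be regenerated with the DetectFaces operation follows; the filename is a placeholder, and Attributes=["ALL"] is needed to get age, gender, and emotions rather than the default summary.

# Sketch: regenerate the per-face age/gender/emotion estimates with Rekognition.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_20291.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")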

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
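
The Google Vision face results above are qualitative likelihood ratings rather than percentages, one block per detected face. A minimal sketch using the Vision client library's face detection follows; the filename is a placeholder.

# Sketch: reproduce the Google Vision likelihood ratings per detected face.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_20291.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)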

Feature analysis

Amazon

Person 99.9%
Shoe 81.4%

Captions

Microsoft

a group of people standing in front of a crowd 87.6%
a woman standing in front of a crowd 87.5%
a group of people standing around each other 84.1%
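
The Microsoft captions above carry their own confidence scores. A sketch using the Azure Computer Vision "describe image" operation follows; the endpoint, key, filename, and max_candidates value are placeholders.

# Sketch: reproduce the Microsoft captions with the Azure describe operation.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("steinmetz_20291.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}")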

Text analysis

Amazon

SIESTA
J.
KEY,
STEINMETZ,
J. J. STEINMETZ, SIESTA KEY, SARASOTA, FLAS
23459.
SARASOTA,
FLAS
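
The Amazon text results above are raw OCR fragments, which is why the photographer's stamp appears both as individual words and as a full line. A sketch using the Rekognition DetectText operation follows; the filename is a placeholder.

# Sketch: regenerate the OCR fragments with Rekognition text detection.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_20291.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Each detection is either a WORD or a LINE; both appear in the listing above.
    print(detection["Type"], detection["DetectedText"])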

Google

STEINMETZ,
SARASOTA,
FLA:
J.
KEY,
23459.
23459. J. J. STEINMETZ, SIESTA KEY, SARASOTA, FLA:
SIESTA
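
The Google text results are similar word- and line-level OCR output. A minimal sketch with the Vision client library's text detection follows; the filename is a placeholder.

# Sketch: regenerate the OCR fragments with Google Cloud Vision text detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_20291.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
# The first annotation is the full detected text; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)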