Human Generated Data

Title

Untitled (couple in front of carnival booth holding stuffed animals)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10619

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.2
Human 99.2
Person 99.2
Person 96.8
Person 96.8
Clothing 91.4
Apparel 91.4
Dog 91.2
Animal 91.2
Canine 91.2
Pet 91.2
Mammal 91.2
Person 85.2
Costume 84.9
Person 77.2
Meal 68.2
Food 68.2
Play 63.9
Female 62.2
Coat 60.4
People 58.4
Kid 58.1
Child 58.1
Shoe 57.5
Footwear 57.5
Advertisement 55.7
Poster 55.5
Person 46.1
Person 43.7
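Each Amazon tag above pairs a label with a confidence score on a 0–100 scale, and the same label can appear several times when multiple instances are detected (e.g. the repeated "Person" entries). A minimal sketch of filtering such output at a confidence threshold; the (label, score) pairs are copied from the list above, and the 90.0 cutoff is an arbitrary illustration, not part of the record:

```python
# Illustrative only: a few (label, confidence) pairs taken from the
# Amazon tag list above. The 90.0 threshold is an assumption for the
# example, not a value used by the catalog.
tags = [
    ("Person", 99.2), ("Human", 99.2), ("Clothing", 91.4),
    ("Dog", 91.2), ("Costume", 84.9), ("Shoe", 57.5),
]

def confident_tags(pairs, threshold=90.0):
    """Return the labels whose confidence meets or exceeds the threshold."""
    return [label for label, score in pairs if score >= threshold]

print(confident_tags(tags))  # → ['Person', 'Human', 'Clothing', 'Dog']
```

Lower-confidence labels such as "Shoe 57.5" would be dropped at this cutoff, which is why feature summaries often look shorter than the full tag list.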

Imagga
created on 2022-01-09

man 25.5
city 20.8
people 19.5
male 19.2
urban 18.3
sport 16.3
pooch 15
person 14.5
street 13.8
men 12.9
portrait 12.3
adult 12
weapon 11.9
business 11.5
black 11
team 10.7
walking 10.4
outdoors 9.7
sword 9.7
success 9.6
athlete 9.6
legs 9.4
device 9.4
action 9.3
dog 9.1
active 8.8
happy 8.8
women 8.7
boy 8.7
wall 8.7
day 8.6
motion 8.6
travel 8.4
power 8.4
fashion 8.3
silhouette 8.3
dress 8.1
restraint 8
building 8
play 7.8
run 7.7
win 7.7
fun 7.5
speed 7.3
competition 7.3
leash 7.3
lifestyle 7.2
body 7.2

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

clothing 95.4
outdoor 95
footwear 94.6
text 93.5
person 93.2
black and white 90.7
cartoon 71.3
man 70.6

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 56.5%
Happy 64.5%
Surprised 25.8%
Calm 4.4%
Fear 3.5%
Disgusted 0.6%
Confused 0.5%
Sad 0.4%
Angry 0.3%

AWS Rekognition

Age 33-41
Gender Female, 68.7%
Happy 89.6%
Calm 7.2%
Surprised 1.6%
Angry 0.5%
Disgusted 0.3%
Confused 0.3%
Fear 0.3%
Sad 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Dog 91.2%
Shoe 57.5%

Captions

Microsoft

a group of people posing for a photo 78.9%
a group of people posing for the camera 78.8%
a group of people posing for a picture 78.7%

Text analysis

Amazon

25
SPILL 25
SPILL
MILK
34639
the MILK
IS
the
THIS IS
THIS
Dorothy'r
are
Amn
that

Google

YT33A2 THIE IS Dorohys 25 SPILL MILK DOROTH
YT33A2
THIE
IS
Dorohys
25
SPILL
MILK
DOROTH