Human Generated Data

Title

Untitled (children playing on see-saws at Tampa Day Nursery)

Date

1959

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8079

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.5
Human 99.5
Person 99.5
Person 97.8
Person 96.5
Clothing 92.7
Apparel 92.7
Person 90.2
Person 89.8
Person 87.4
Person 86.4
Person 83.6
Shorts 76
Person 67.8
People 64.5
Pants 64.1
Airplane 61
Transportation 61
Vehicle 61
Aircraft 61
Meal 60.9
Food 60.9
Kid 58.5
Child 58.5
Building 57.9
Girl 57.3
Female 57.3
Kindergarten 56.8
Housing 55.1
Person 48
Person 45.2
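
The labels above have the shape of AWS Rekognition detect_labels output. A minimal sketch of how such a list could be produced with boto3; the image file name and the MinConfidence threshold are illustrative assumptions, not part of the record.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("steinmetz_8079.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=45,  # assumed threshold; the lowest label above is 45.2
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')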

Imagga
created on 2022-01-15

trombone 76.2
brass 64.7
wind instrument 49.5
musical instrument 48.1
violin 29.9
bowed stringed instrument 25.7
stringed instrument 20.2
people 19
man 18.8
outdoors 14.9
male 14.9
portrait 14.2
person 14
fun 13.5
day 13.3
adult 13
protection 12.7
play 12.1
leisure 11.6
mask 11.5
happy 11.3
happiness 11
playing 10.9
danger 10.9
industrial 10
city 10
outdoor 9.9
sport 9.9
building 9.8
destruction 9.8
urban 9.6
walking 9.5
sitting 9.4
dark 9.2
old 9.1
dirty 9
black 9
summer 9
child 8.9
protective 8.8
nuclear 8.7
gas 8.7
sky 8.3
dress 8.1
stalker 7.9
radioactive 7.8
radiation 7.8
accident 7.8
toxic 7.8
chemical 7.7
silhouette 7.4
smoke 7.4
safety 7.4
vacation 7.4
water 7.3
swing 7.3
smiling 7.2
color 7.2
lifestyle 7.2
weapon 7.2
childhood 7.2
love 7.1
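
Imagga exposes its tagger as a REST endpoint (v2/tags). A minimal sketch using the requests library; the API key, secret, and image URL are placeholders.

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/steinmetz_8079.jpg"},  # placeholder URL
    auth=(API_KEY, API_SECRET),
)

# Each entry carries a confidence score and a localized tag name.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')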

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

outdoor 99.1
text 95.9
clothing 85.6
person 83.7
house 68.3
child 52.4
old 49.7
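
The Microsoft tags correspond to the Azure Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),          # placeholder key
)

result = client.tag_image("https://example.com/steinmetz_8079.jpg")  # placeholder URL

# The SDK reports confidence on a 0-1 scale; the record prints percentages.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")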

Face analysis

Amazon

AWS Rekognition

Age 33-41
Gender Female, 99.6%
Happy 96.9%
Angry 1.3%
Surprised 0.5%
Calm 0.5%
Confused 0.4%
Disgusted 0.3%
Fear 0.1%
Sad 0.1%

AWS Rekognition

Age 36-44
Gender Male, 95.2%
Calm 74%
Happy 12.4%
Sad 6.1%
Surprised 2.7%
Angry 1.6%
Confused 1.3%
Fear 1.1%
Disgusted 0.7%
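
The two face blocks above match AWS Rekognition detect_faces output with the full attribute set requested. A minimal sketch; the image file name is an illustrative assumption.

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_8079.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed for age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; the record lists them by confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')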

Feature analysis

Amazon

Person 99.5%
Airplane 61%
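
These feature entries appear to be the subset of detect_labels results that carry per-instance bounding boxes; Rekognition attaches an Instances array to object labels such as Person and Airplane. A minimal sketch of reading them, with the file name again an assumption.

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_8079.jpg", "rb") as f:  # hypothetical file name
    labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

for label in labels:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # normalized Left/Top/Width/Height
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'at ({box["Left"]:.2f}, {box["Top"]:.2f})')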

Captions

Microsoft

a group of people standing in front of a building 89.4%
an old photo of a group of people standing in front of a building 84.8%
a group of people standing next to a train 72.8%
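
The three ranked captions match the Azure Computer Vision describe operation with three candidates requested. A minimal sketch; the endpoint, key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),          # placeholder key
)

description = client.describe_image(
    "https://example.com/steinmetz_8079.jpg",  # placeholder URL
    max_candidates=3,  # the record shows three candidate captions
)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")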

Text analysis

Amazon

GLASS
ALL
M
IX ALL GLASS CO.
IX
44465
1607
CO.
113
YT33A2
a M 113 YT33A2
a
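
The Amazon strings are typical AWS Rekognition detect_text output, which returns full LINE detections alongside their component WORDs; that is why overlapping fragments such as "IX ALL GLASS CO." and "IX" both appear. A minimal sketch, with the file name an assumption.

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_8079.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is either "LINE" or "WORD"; lines are split into their words.
    print(detection["Type"], detection["DetectedText"])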

Google

K ALL GASS C
K
ALL
GASS
C