Human Generated Data

Title

Untitled (Headdress Ball: man and woman with decorative headdresses standing near display)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5645

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.6
Human 99.6
Person 98.6
Apparel 96
Clothing 96
Female 67.7
Plant 65.4
Face 64.2
Crowd 59.9
Text 58.3
Chef 57.5
Hat 57
Food 57
Cake 57
Creme 57
Cream 57
Icing 57
Dessert 57

Imagga
created on 2021-12-15

brass 37.1
wind instrument 33.9
cornet 28.4
musical instrument 24.5
man 23.5
male 22
person 21.7
device 21.7
people 20.6
adult 17.4
horn 17.2
sport 16
active 15.3
business 14.6
silhouette 14.1
outdoor 13.8
work 13.3
bugle 12.7
job 12.4
businessman 12.4
portrait 12.3
newspaper 12.2
group 12.1
sky 11.5
outdoors 11.2
men 11.2
team 10.8
human 10.5
world 10.3
black 10.2
design 10.1
lifestyle 10.1
professional 10
instrumentality 9.9
product 9.8
cloud 9.5
travel 9.2
exercise 9.1
recreation 9
women 8.7
sax 8.6
art 8.6
manager 8.4
vacation 8.2
landscape 8.2
technology 8.2
happy 8.1
activity 8.1
mask 7.9
builder 7.9
creation 7.8
summer 7.7
engineer 7.7
construction 7.7
pretty 7.7
project 7.7
grunge 7.7
old 7.7
chart 7.6
beach 7.6
equipment 7.6
power 7.6
healthy 7.6
fashion 7.5
fun 7.5
leisure 7.5
style 7.4
retro 7.4
water 7.3
sexy 7.2
building 7.1
hair 7.1
weapon 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98.9
window 88.7
black and white 86.7
cartoon 86.6
drawing 63.4
animal 62.5
horse 56.1
sketch 55.5
clothing 54
posing 49.1

Face analysis

AWS Rekognition

Age 20-32
Gender Female, 75.8%
Calm 33.1%
Happy 29.1%
Sad 12.2%
Fear 10.5%
Surprised 7.4%
Angry 4.2%
Confused 3.1%
Disgusted 0.4%

AWS Rekognition

Age 22-34
Gender Male, 87.5%
Calm 97.4%
Happy 1.4%
Surprised 0.6%
Angry 0.3%
Sad 0.2%
Fear 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 22-34
Gender Female, 54.4%
Happy 77.1%
Calm 14.7%
Sad 5.8%
Disgusted 0.8%
Confused 0.7%
Angry 0.5%
Surprised 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people posing for a photo 87.8%
a group of people posing for a photo in front of a window 84.6%
a group of people posing for the camera 84.5%

Text analysis

Amazon

14535
14535.

Google

14535.
14535. 14535. 14535.