Human Generated Data

Title

Untitled (couple dancing)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4945

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 100
Apparel 100
Person 98.9
Human 98.9
Person 98.6
Coat 98.2
Overcoat 98.2
Suit 98.2
Person 98.1
Person 96
Dress 94.2
Person 93.7
Robe 89.6
Fashion 89.6
Gown 89.4
Sleeve 88.7
Wedding 88.3
Bridegroom 87.6
Female 87
Shirt 85.4
Long Sleeve 84.7
Tuxedo 79.5
Woman 74.9
Wedding Gown 74.8
Bride 65.9
Man 63.7
Face 62.9
Dating 56.5
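The Amazon tags above pair each label with a confidence score on a 0–100 scale. A minimal sketch of filtering such (label, score) pairs by a threshold, using a few values copied from the list (the helper name `filter_tags` is illustrative, not part of any API):

```python
# A few (label, confidence) pairs taken from the Amazon tag list above.
tags = [
    ("Clothing", 100.0), ("Person", 98.9), ("Coat", 98.2),
    ("Dress", 94.2), ("Wedding", 88.3), ("Bride", 65.9), ("Dating", 56.5),
]

def filter_tags(tags, min_confidence=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [label for label, score in tags if score >= min_confidence]

print(filter_tags(tags))  # → ['Clothing', 'Person', 'Coat', 'Dress']
```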

Imagga
created on 2022-01-23

person 34.9
man 31.8
adult 30
male 29.1
people 25.6
business 23.1
happy 22.5
smiling 21
professional 20.9
portrait 19.4
senior 18.7
office 17.8
attractive 17.5
holding 17.3
lifestyle 15.9
businessman 15.9
mature 15.8
work 15.8
laptop 15.7
indoors 14.9
old 14.6
computer 14.6
businesswoman 13.6
face 13.5
corporate 12.9
sitting 12.9
clothing 12.5
couple 12.2
executive 11.9
alone 11.9
casual 11.9
handsome 11.6
elderly 11.5
smile 11.4
businesspeople 11.4
success 11.3
modern 11.2
looking 11.2
men 11.2
love 11
indoor 11
confident 10.9
job 10.6
one 10.4
technology 10.4
black 10.3
day 10.2
pretty 9.8
1 9.6
together 9.6
student 9.5
manager 9.3
20s 9.2
jacket 9
cheerful 8.9
lady 8.9
working 8.8
older 8.7
shirt 8.7
education 8.7
model 8.6
color 8.3
glasses 8.3
fashion 8.3
human 8.2
successful 8.2
planner 8
home 8
hair 7.9
gray hair 7.9
good mood 7.8
expression 7.7
workplace 7.6
age 7.6
hand 7.6
desk 7.6
suit 7.6
leisure 7.5
wagon 7.3
dress 7.2
worker 7.2
medical 7.1
happiness 7
statue 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 96.3
person 93.6
human face 92.9
clothing 90.3
standing 90
sketch 86.9
man 72.3
drawing 61.2

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 77.5%
Happy 50.8%
Calm 26.8%
Disgusted 9.4%
Angry 4.9%
Confused 4.6%
Surprised 1.7%
Sad 1.2%
Fear 0.7%

AWS Rekognition

Age 34-42
Gender Female, 95.6%
Happy 94.6%
Calm 1.6%
Sad 0.8%
Surprised 0.8%
Fear 0.7%
Disgusted 0.6%
Angry 0.4%
Confused 0.4%

AWS Rekognition

Age 16-24
Gender Female, 98%
Calm 91.4%
Happy 3.8%
Confused 1.7%
Surprised 0.9%
Angry 0.7%
Fear 0.6%
Disgusted 0.5%
Sad 0.5%
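Each AWS Rekognition face block above reports an age range, a gender estimate, and a confidence for every emotion. A sketch of reducing such a block to its dominant emotion, using the values from the third face above (the helper `dominant_emotion` is illustrative):

```python
# Emotion confidences (percent) from the third Rekognition face above.
emotions = {
    "Calm": 91.4, "Happy": 3.8, "Confused": 1.7, "Surprised": 0.9,
    "Angry": 0.7, "Fear": 0.6, "Disgusted": 0.5, "Sad": 0.5,
}

def dominant_emotion(emotions):
    """Return the (name, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))  # → ('Calm', 91.4)
```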

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
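Unlike the Rekognition blocks, Google Vision reports each attribute as an ordered likelihood category rather than a numeric score. A sketch of comparing such categories on that scale (the ordering below follows the five likelihood labels used in the blocks above; the helper `at_least` is illustrative):

```python
# Google Vision likelihood categories, from least to most likely.
LIKELIHOODS = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def at_least(value, threshold):
    """True if `value` is at or above `threshold` on the likelihood scale."""
    return LIKELIHOODS.index(value) >= LIKELIHOODS.index(threshold)

print(at_least("Possible", "Unlikely"))  # → True
```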

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a group of people standing in a room 92.9%
a group of people standing next to a person 82.7%
a group of people posing for a photo 82.6%

Text analysis

Amazon

12653
12653.
RODON

Google

12653.
12653. 12653.