Human Generated Data

Title

Untitled (men posed with racing dog)

Date

1972

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11569

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.1
Human 99.1
Person 98.9
Person 98.5
Apparel 98.2
Clothing 98.2
Animal 97.4
Pet 97.4
Dog 97.4
Mammal 97.4
Canine 97.4
Person 97.3
Person 97
Person 96.8
Person 93.8
Suit 86.5
Coat 86.5
Overcoat 86.5
Advertisement 75.4
People 74.9
Sailor Suit 71.3
Sleeve 70.9
Poster 70.9
Footwear 69.9
Shoe 69.9
Long Sleeve 68.9
Accessories 67.9
Tie 67.9
Accessory 67.9
Female 63
Shoe 62.5
Shoe 61.2
Photo 60.9
Photography 60.9
Doctor 59.6
Clinic 59.3
Transportation 59
Shorts 58.8
Pants 57.3
Girl 56.4

Imagga
created on 2022-01-15

male 35.5
businessman 34.4
man 34.3
people 34
business 33.4
person 31.9
professional 26.4
corporate 25.8
job 24.8
group 24.2
adult 23.8
office 23.3
men 22.3
work 21.2
executive 19.9
meeting 19.8
women 19.8
team 19.7
teamwork 19.5
happy 19.4
brass 18.8
suit 18.7
success 17.7
manager 17.7
worker 16.4
building 16.1
wind instrument 15.5
together 14.9
modern 14.7
businesswoman 14.5
smiling 14.5
handsome 14.3
couple 13.9
portrait 13.6
two 13.6
smile 13.5
ethnic 13.3
life 12.7
boss 12.4
room 12.4
cornet 12.1
musical instrument 12
lifestyle 11.6
working 11.5
black 11.4
talking 11.4
table 11.2
mature 11.2
employee 11.1
casual 11
lab coat 11
successful 11
silhouette 10.8
conference 10.7
coat 10.7
career 10.4
desk 10.4
planner 10.2
nurse 10.2
communication 10.1
human 9.7
diversity 9.6
standing 9.6
hands 9.6
businesspeople 9.5
company 9.3
patient 9.2
laptop 9.1
stage 9
looking 8.8
discussion 8.8
golfer 8.7
spectator 8.6
walking 8.5
finance 8.4
new 8.1
computer 8.1
player 8
day 7.8
diverse 7.8
architecture 7.8
hall 7.7
sitting 7.7
corporation 7.7
attractive 7.7
partnership 7.7
adults 7.6
light 7.4
clothing 7.3
interior 7.1
happiness 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.6
outdoor 94.7
person 90.6
dog 85.7
clothing 84.4
footwear 65.7
man 65.3
black and white 52

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 99.4%
Sad 46.8%
Calm 23%
Angry 14.5%
Surprised 7.9%
Confused 2.8%
Disgusted 1.8%
Fear 1.6%
Happy 1.6%

AWS Rekognition

Age 51-59
Gender Male, 95.2%
Happy 56.7%
Calm 34.2%
Confused 3.7%
Sad 2.1%
Disgusted 1%
Surprised 0.8%
Angry 0.8%
Fear 0.7%

AWS Rekognition

Age 47-53
Gender Male, 52.3%
Happy 88.4%
Sad 7.1%
Confused 1.7%
Calm 1.5%
Surprised 0.5%
Disgusted 0.3%
Fear 0.3%
Angry 0.2%

AWS Rekognition

Age 51-59
Gender Male, 97%
Happy 96%
Calm 3.5%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%
Sad 0.1%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 31-41
Gender Male, 99.4%
Calm 76.3%
Sad 7.5%
Angry 5.4%
Fear 3.6%
Disgusted 2.8%
Surprised 2%
Happy 1.2%
Confused 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Dog 97.4%
Shoe 69.9%
Tie 67.9%

Captions

Microsoft

a group of people posing for the camera 56.7%
a group of people posing for a picture 56.6%
a group of people standing in front of a sign 56.5%

Text analysis

Amazon

1972
KINGS-QUEENS
SARASOTA
59885.
SHC

Google

1972
59885.
SARASDTA
SI'C
KINGS-QUEENS
59885. SI'C SARASDTA KINGS-QUEENS 1972