Human Generated Data

Title

Untitled (band playing at wedding reception)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10673

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Person 98.8
Person 98.7
Person 98.2
Person 96.5
Indoors 96.3
Interior Design 96.3
Person 96.2
Person 93.9
Person 93.6
Person 92.2
Person 91.6
Classroom 86.2
Room 86.2
School 86.2
Crowd 85.8
Tie 85.3
Accessory 85.3
Accessories 85.3
Person 84.5
Leisure Activities 82.6
Person 81.4
Audience 81
Musician 76.9
Musical Instrument 76.9
Person 76.8
Person 76.7
Apparel 76.2
Clothing 76.2
Person 74.7
Person 74.3
People 73.6
Person 70.9
Person 70.6
Face 70.1
Helmet 64.5
Person 62.3
Person 60.4
Guitar 58.6
Performer 57.4
Guitarist 57.4
Coat 56
Suit 56
Overcoat 56

Imagga
created on 2022-01-15

people 36.2
male 31.9
man 30.9
meeting 30.1
person 28.6
room 27.8
group 25
office 24.7
businessman 23.8
business 22.4
together 21.9
couple 21.8
brass 20.9
team 20.6
home 19.9
teamwork 19.4
table 19
work 18.9
adult 18.6
indoors 18.4
happy 18.1
teacher 18.1
classroom 18.1
men 18
wind instrument 17.6
professional 17.1
smiling 16.6
businesswoman 16.3
corporate 16.3
sitting 16.3
communication 15.9
senior 15.9
women 15.8
executive 15.2
desk 15.1
modern 14.7
lifestyle 14.4
musical instrument 14.4
talking 14.2
businesspeople 14.2
suit 13.5
happiness 13.3
job 13.3
nurse 13.1
education 12.1
indoor 11.9
laptop 11.8
conference 11.7
colleagues 11.6
worker 11.6
interior 11.5
mature 11.1
chair 10.6
success 10.4
life 10.4
manager 10.2
cheerful 9.7
discussion 9.7
working 9.7
educator 9.4
presentation 9.3
smile 9.3
cornet 9.2
portrait 9
student 8.9
hall 8.9
handsome 8.9
coworkers 8.8
looking 8.8
love 8.7
clothing 8.7
ethnic 8.6
camera 8.3
occupation 8.2
successful 8.2
computer 8
half length 7.8
40s 7.8
two people 7.8
mid adult 7.7
elderly 7.6
boss 7.6
workplace 7.6
finance 7.6
togetherness 7.5
kin 7.5
friendship 7.5
school 7.4
confident 7.3
color 7.2
family 7.1

Google
created on 2022-01-15

Black 89.8
Black-and-white 84.7
Style 83.9
Hat 76
Monochrome 74.5
Monochrome photography 74.3
Font 71.9
Event 71.9
Crew 71.5
Room 70.8
Suit 68.7
Vintage clothing 68.1
Art 65.2
Photo caption 64.4
History 64.3
Stock photography 63.9
Team 63.1
Illustration 57.5
T-shirt 55.8
Uniform 50.2

Microsoft
created on 2022-01-15

person 98.5
text 97.4
group 58.4
people 57.8
posing 52.5

Face analysis

Amazon

Google

AWS Rekognition

Age 29-39
Gender Male, 82.4%
Calm 99.3%
Surprised 0.2%
Confused 0.1%
Angry 0.1%
Sad 0.1%
Disgusted 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 43-51
Gender Male, 77.9%
Sad 38.6%
Calm 33.4%
Happy 22.1%
Confused 1.5%
Surprised 1.5%
Fear 1%
Disgusted 0.9%
Angry 0.9%

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Sad 64.6%
Calm 32.7%
Confused 0.9%
Angry 0.7%
Fear 0.3%
Disgusted 0.3%
Happy 0.2%
Surprised 0.2%

AWS Rekognition

Age 45-51
Gender Male, 85%
Confused 55.3%
Sad 23.6%
Calm 13.2%
Happy 2.4%
Surprised 2.1%
Fear 1.5%
Disgusted 1%
Angry 1%

AWS Rekognition

Age 50-58
Gender Male, 81.8%
Calm 65.9%
Surprised 9.1%
Confused 8.5%
Sad 7.8%
Fear 4.9%
Happy 1.4%
Angry 1.3%
Disgusted 1.1%

AWS Rekognition

Age 54-64
Gender Female, 67.9%
Calm 96.8%
Happy 1.1%
Fear 0.7%
Confused 0.4%
Angry 0.3%
Disgusted 0.3%
Sad 0.3%
Surprised 0.2%

AWS Rekognition

Age 41-49
Gender Male, 99.2%
Surprised 26.5%
Calm 25.8%
Sad 18.9%
Happy 7.3%
Angry 6.5%
Fear 6%
Confused 4.9%
Disgusted 4%

AWS Rekognition

Age 43-51
Gender Male, 94.5%
Calm 99.3%
Sad 0.6%
Confused 0%
Disgusted 0%
Happy 0%
Angry 0%
Fear 0%
Surprised 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Tie 85.3%
Helmet 64.5%

Captions

Microsoft

a group of people posing for a photo 95.6%
a group of people posing for the camera 95.5%
a group of people posing for a picture 95.4%

Text analysis

Amazon

2
21330.

Google

EI
Z
•OE EI Z 21330.
21330.
•OE