Human Generated Data

Title

Untitled (wedding guests at a table under a tent)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8526

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99.5
Person 99.4
Person 99.3
Person 99.1
Person 99
Person 98.5
Person 97.5
Person 94.6
Clothing 94.2
Apparel 94.2
Person 92.3
Crowd 87.8
Suit 87.6
Overcoat 87.6
Coat 87.6
Musical Instrument 83.1
Musician 83.1
People 77.3
Meal 76.1
Food 76.1
Accessories 73.1
Accessory 73.1
Sunglasses 73.1
Tie 65.3
Leisure Activities 63.1
Tuxedo 59.6
Silhouette 58.5
Music Band 58.3
Plant 57.7
Female 55
Person 45.8

Clarifai
created on 2023-10-25

people 100
group 99.7
many 98.8
group together 98.3
adult 97.9
man 97.8
leader 97
several 94.9
woman 94.3
administration 91.1
recreation 87.7
child 86.3
wear 85.6
elderly 82.8
ceremony 81.6
five 81.2
actor 80.4
chair 80.3
music 79.1
clergy 78.2

Imagga
created on 2022-01-09

man 47.7
male 42.5
businessman 37.1
business 35.2
people 31.8
person 29.4
men 28.3
group 26.6
office 25.8
meeting 25.4
adult 25
executive 24.7
happy 23.2
team 22.4
colleagues 22.3
senior 20.6
couple 20
businesspeople 19.9
work 19.6
businesswoman 19.1
corporate 18.9
smiling 18.1
sitting 18
worker 17.5
secretary 17.2
talking 17.1
mature 16.7
table 16.4
job 15.9
together 15.8
coworkers 15.7
discussion 15.6
casual 15.2
women 15
indoors 14.9
teamwork 14.8
desk 14.2
professional 13.6
room 13.6
portrait 13.6
old 13.2
lifestyle 13
smile 12.8
20s 12.8
suit 12.6
handsome 12.5
working 12.4
manager 12.1
camera 12
looking 12
associates 11.8
conference 11.7
40s 11.7
30s 11.5
cheerful 11.4
life 11.1
happiness 11
laptop 10.9
discussing 10.8
retired 10.7
mid adult 10.6
spectator 10.5
technology 10.4
day 10.2
building 9.8
businessmen 9.7
busy 9.6
four 9.6
elderly 9.6
home 9.6
engineer 9.3
company 9.3
horizontal 9.2
color 8.9
boardroom 8.9
four people 8.9
collaboration 8.9
success 8.8
middle aged 8.8
older 8.7
paper 8.6
serious 8.6
face 8.5
adults 8.5
plan 8.5
friends 8.4
communication 8.4
presentation 8.4
new 8.1
computer 8
musical instrument 7.9
classroom 7.9
love 7.9
crew 7.9
conversation 7.8
cooperation 7.7
employee 7.7
drink 7.5
successful 7.3
indoor 7.3
aged 7.2
architecture 7
modern 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 99.7
text 97.7
clothing 96.9
man 94.5
outdoor 90.8
black and white 77.6
people 73.6
old 51.1
crowd 2.1

Face analysis

AWS Rekognition

Age 54-62
Gender Male, 99.7%
Surprised 51.1%
Calm 28.2%
Happy 15.5%
Sad 2.8%
Angry 0.6%
Fear 0.6%
Disgusted 0.6%
Confused 0.5%

AWS Rekognition

Age 33-41
Gender Female, 50.7%
Calm 99.1%
Sad 0.9%
Angry 0%
Happy 0%
Fear 0%
Confused 0%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 31-41
Gender Male, 98.8%
Calm 99.1%
Surprised 0.5%
Happy 0.1%
Confused 0.1%
Sad 0.1%
Disgusted 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 28-38
Gender Male, 96.9%
Happy 82.1%
Calm 8.3%
Sad 3.9%
Confused 2.5%
Fear 1.4%
Surprised 0.9%
Disgusted 0.7%
Angry 0.3%

AWS Rekognition

Age 48-54
Gender Female, 89.6%
Calm 39.2%
Sad 24.1%
Fear 19.8%
Disgusted 8.6%
Confused 2.5%
Happy 2.5%
Surprised 1.9%
Angry 1.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Sunglasses 73.1%
Tie 65.3%

Text analysis

Amazon

17536
19536.
VINTNERS

Google

17536 NERS 19536.
17536
NERS
19536.