Human Generated Data

Title

Untitled (three middle-aged women posed with one woman smoking at table during fancy event)

Date

1950

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9288

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 99.3
Person 98.9
Person 96.7
Sitting 96
Apparel 89
Clothing 89
People 78.9
Female 74.9
Overcoat 70.8
Suit 70.8
Coat 70.8
Musical Instrument 64.6
Musician 64.6
Crowd 64.5
Photography 60.1
Photo 60.1
Face 60.1
Portrait 60.1
Woman 59.1
Food 59
Meal 59
Tablecloth 58.3
Performer 57

Imagga
created on 2022-01-23

man 41.7
male 39
business 38.9
businessman 38.9
meeting 33.9
person 33.6
office 33.2
executive 32.9
people 32.4
table 30.3
team 28.7
businesswoman 28.2
professional 27.8
corporate 27.5
group 27.4
men 26.6
wind instrument 26.1
adult 26
teacher 25.8
brass 25.6
work 25.1
colleagues 24.3
communication 23.5
sitting 23.2
happy 23.2
businesspeople 21.8
teamwork 21.3
talking 20.9
musical instrument 20.7
together 20.2
laptop 20.1
room 20
worker 19
women 19
smiling 18.8
couple 18.3
desk 18
suit 17.1
conference 16.6
discussion 16.6
cornet 15.9
job 15.9
manager 15.8
outfit 14.5
smile 14.3
educator 14.1
indoors 14.1
presentation 14
confident 13.7
entrepreneur 13.4
modern 13.3
senior 13.1
mature 13
cheerful 13
coworkers 12.8
home 12.8
handsome 12.5
workplace 12.4
chair 12.3
successful 11.9
associates 11.8
discussing 11.8
conversation 11.7
corporation 11.6
leader 11.6
employee 11.6
30s 11.5
working 11.5
boss 11.5
success 11.3
education 11.3
computer 11.2
indoor 11
diverse 10.8
company 10.2
happiness 10.2
casual 10.2
two 10.2
finance 10.1
oboe 10
board 10
student 9.8
classroom 9.8
30-35 years 9.8
interior 9.7
portrait 9.7
diversity 9.6
ethnic 9.5
career 9.5
coffee 9.3
holding 9.1
life 9.1
hall 9
new 8.9
partners 8.7
busy 8.7
lifestyle 8.7
staff 8.6
adults 8.5
sax 8.4
waiter 8.3
looking 8
seminar 7.9
standing 7.8
colleague 7.8
document 7.8
40s 7.8
full length 7.8
cooperation 7.7
planning 7.7
partnership 7.7
collar 7.7
four 7.7
tie 7.6
plan 7.6
wine 7.4
phone 7.4
20s 7.3
glass 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 99.2
sitting 97.1
text 95.4
clothing 91.9
group 85.5
woman 85.3
people 81.8
man 63.8
table 56.9

Face analysis

Amazon

Google

AWS Rekognition

Age 28-38
Gender Male, 65.9%
Happy 80%
Sad 6.9%
Calm 4.6%
Surprised 3.5%
Confused 1.9%
Disgusted 1.2%
Fear 1.1%
Angry 0.9%

AWS Rekognition

Age 45-53
Gender Male, 98%
Sad 84.3%
Calm 6.6%
Happy 4.6%
Angry 2.8%
Confused 0.7%
Disgusted 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 48-54
Gender Male, 99.7%
Happy 99%
Calm 0.6%
Surprised 0.2%
Sad 0.1%
Disgusted 0%
Confused 0%
Fear 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people sitting at a table 98.9%
a group of people sitting around a table 98.8%
a group of people sitting on a table 97.8%

Text analysis

Amazon

KODVK-SVELA

Google

YT3RA2-YAGON
YT3RA2-YAGON