Human Generated Data

Title

Untitled (guests eating at reception)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.429.20

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Restaurant 99.6
Person 99
Human 99
Person 98.7
Sitting 98.4
Person 98.1
Person 97.7
Person 95.5
Person 93.5
Food 93.4
Food Court 93.4
Meal 91.2
Person 87.7
Cafeteria 86.7
Furniture 84.2
Cafe 81.3
Table 81.1
Chair 75.3
Person 75
Person 73.2
Chair 70.3
Dining Table 68.4
Couch 63.9
People 63.9
Dish 63.8
Person 63.3
Suit 63.2
Overcoat 63.2
Coat 63.2
Apparel 63.2
Clothing 63.2
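
The Amazon labels above pair each tag with a confidence score. A minimal sketch of how such label/confidence pairs can be retrieved with the AWS Rekognition DetectLabels API is shown here; the file name photo.jpg and the MinConfidence threshold are placeholder assumptions, and this is an illustration rather than a record of the pipeline actually used to generate the data on this page.

import boto3

# Assumes AWS credentials and a default region are already configured.
rekognition = boto3.client("rekognition")

# "photo.jpg" is a placeholder for a local copy of the photograph.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=60,  # arbitrary cutoff; the list above bottoms out around 63
    )

# Each label carries a name and a confidence percentage, e.g. "Restaurant 99.6".
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))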

Clarifai
created on 2019-03-25

people 99.9
group together 99.3
group 99.2
adult 97.9
man 96.9
woman 96.1
several 96
four 95.1
administration 95
leader 94.8
many 91.5
five 91.5
two 87.9
three 87.5
furniture 86.6
sit 86.5
facial expression 85
actress 84.9
recreation 79.2
chair 78.1

Imagga
created on 2019-03-25

teacher 52.8
man 39
adult 37.5
people 36.8
male 36.3
person 36.1
together 33.3
educator 31.5
indoors 30.8
home 28.7
professional 28
couple 27
classroom 26.7
happy 26.3
smiling 26.1
senior 25.3
group 25
meeting 24.5
table 22.5
sitting 22.3
businessman 22.1
room 21.9
business 21.9
laptop 21.1
mature 20.5
businesswoman 20
talking 20
men 19.8
office 19.5
cheerful 18.7
computer 18.5
businesspeople 17.1
desk 17
lifestyle 16.6
women 16.6
colleagues 16.5
smile 15.7
team 15.2
teamwork 14.8
education 14.7
work 14.1
friends 14.1
20s 13.8
discussion 13.6
student 13.6
looking 13.6
30s 13.5
elderly 13.4
family 13.3
happiness 13.3
executive 13.2
learning 13.2
discussing 12.8
two 12.7
retired 12.6
working 12.4
corporate 12
successful 11.9
indoor 11.9
mid adult 11.6
school 11.5
husband 11.5
success 11.3
attractive 11.2
love 11.1
two people 10.7
class 10.6
child 10.6
boy 10.4
coffee 10.2
mother 10.1
horizontal 10.1
drink 10
face 10
old 9.8
teaching 9.7
middle aged 9.7
waiter 9.7
retirement 9.6
wife 9.5
casual 9.3
presentation 9.3
eating 9.3
children 9.1
musical instrument 9.1
cup 9
worker 9
suit 9
color 8.9
to 8.9
coworkers 8.8
students 8.8
drinking 8.6
portrait 8.4
camera 8.3
meal 8.2
technology 8.2
director 8.2
job 8
brunette 7.8
businessmen 7.8
40s 7.8
busy 7.7
modern 7.7
workplace 7.6
reading 7.6
females 7.6
communication 7.6
togetherness 7.6
kin 7.5
grandfather 7.5
study 7.5
holding 7.4
parent 7.4
food 7.3
blond 7.2
handsome 7.1
kid 7.1

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

person 99.8
indoor 94.4
group 77.2
people 72.8
competition 12
black and white 11.7

Color Analysis

Face analysis

AWS Rekognition

Age 27-44
Gender Female, 95.2%
Calm 0.8%
Happy 90.7%
Confused 0.9%
Disgusted 1.3%
Sad 2%
Surprised 3.3%
Angry 1%

AWS Rekognition

Age 19-36
Gender Male, 74.6%
Calm 24.8%
Sad 50%
Disgusted 2.4%
Happy 3.1%
Confused 10%
Angry 6.8%
Surprised 2.9%

AWS Rekognition

Age 57-77
Gender Female, 99.7%
Sad 2.3%
Surprised 2.5%
Calm 0.8%
Confused 1.7%
Happy 89.2%
Disgusted 1.8%
Angry 1.8%

AWS Rekognition

Age 20-38
Gender Female, 54.9%
Confused 45.6%
Happy 46.1%
Sad 46%
Surprised 45.2%
Angry 45.1%
Disgusted 45.1%
Calm 51.9%

AWS Rekognition

Age 12-22
Gender Female, 61.1%
Angry 22.6%
Sad 18.5%
Happy 8.6%
Calm 11.6%
Surprised 6.3%
Confused 8.1%
Disgusted 24.5%

AWS Rekognition

Age 35-52
Gender Female, 52.9%
Angry 45.9%
Surprised 45.2%
Sad 52.4%
Calm 45.7%
Confused 45.2%
Happy 45.3%
Disgusted 45.2%

AWS Rekognition

Age 38-59
Gender Female, 51.4%
Disgusted 45.5%
Angry 45.5%
Sad 52.7%
Happy 45.1%
Calm 45.6%
Confused 45.4%
Surprised 45.2%

AWS Rekognition

Age 45-66
Gender Male, 51.6%
Happy 53.5%
Confused 45.1%
Sad 45.5%
Calm 45.1%
Surprised 45.2%
Angry 45.3%
Disgusted 45.3%

AWS Rekognition

Age 48-68
Gender Male, 50.4%
Angry 49.5%
Sad 49.6%
Surprised 49.5%
Confused 49.5%
Calm 50.1%
Happy 49.6%
Disgusted 49.6%

AWS Rekognition

Age 35-53
Gender Female, 50.1%
Happy 49.7%
Disgusted 49.6%
Confused 49.5%
Surprised 49.6%
Calm 49.7%
Sad 49.8%
Angry 49.6%

AWS Rekognition

Age 48-68
Gender Male, 50.6%
Happy 45.8%
Sad 47.2%
Confused 45.4%
Angry 45.5%
Disgusted 48.8%
Calm 46.9%
Surprised 45.4%
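
Each AWS Rekognition face entry above reports an estimated age range, a gender guess with its confidence, and a distribution over emotions. A minimal sketch of how per-face attributes like these can be requested through the DetectFaces API follows; the file name and the simple post-processing are assumptions for illustration, not documentation of how this page was produced.

import boto3

rekognition = boto3.client("rekognition")  # assumes configured credentials/region

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 27, "High": 44}
    gender = face["Gender"]       # e.g. {"Value": "Female", "Confidence": 95.2}
    emotions = face["Emotions"]   # list of {"Type": "HAPPY", "Confidence": ...}
    top_emotion = max(emotions, key=lambda e: e["Confidence"])
    print(age, gender["Value"], top_emotion["Type"])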

Microsoft Cognitive Services

Age 55
Gender Female

Microsoft Cognitive Services

Age 48
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Chair 75.3%
Suit 63.2%

Categories

Imagga

people portraits 99.3%

Text analysis

Amazon

MARTIN
MARTIN SCHWEIC
SCHWEIC
SAINT
LOUIS
SAINT LOUIS SAINT LOUIS
SAINT LOUIS

Google

MARTIN SCHWEİC SAINT LOUIS
MARTIN
SCHWEİC
SAINT
LOUIS
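
The text results above appear to come from OCR over the photographer's studio imprint (Martin Schweig, Saint Louis), which both services read imperfectly. A minimal sketch of the Amazon side, using the Rekognition DetectText API, is given below; the file name is a placeholder and the call is illustrative only, not documentation of this page's pipeline.

import boto3

rekognition = boto3.client("rekognition")  # assumes configured credentials/region

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Detections come back as both LINE and WORD entries, which is why the list
# above repeats fragments such as "SAINT" and "SAINT LOUIS".
for det in response["TextDetections"]:
    print(det["Type"], det["DetectedText"], round(det["Confidence"], 1))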