Human Generated Data

Title

Untitled (group of men standing behind group of women seated at dinner table)

Date

1939

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4032

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.7
Person 99.7
Person 99.1
Meal 98.6
Food 98.6
Person 98.4
Person 97.7
Person 97
Person 96.7
Person 96.6
Person 96.4
Person 92.7
Furniture 91
Table 91
Person 90.1
Person 89.9
Dish 87.7
Person 85.7
Person 84.9
People 84.1
Restaurant 83.9
Person 83.3
Person 83.1
Tablecloth 82.2
Person 75.6
Indoors 71.5
Cafeteria 69.5
Room 67.5
Photography 62.7
Face 62.7
Portrait 62.7
Photo 62.7
Buffet 58.5
Dining Table 57.3
Dining Room 55.7

Clarifai
created on 2019-06-01

people 99.9
group 97.9
woman 97
adult 96.8
man 95.6
veil 94.9
group together 94.3
many 92.8
wedding 92.2
sit 90.8
child 88.4
chair 87.1
furniture 85.4
several 83.5
wear 82.8
room 81.5
indoors 81
monochrome 80.3
leader 80.1
banquet 77.3

Imagga
created on 2019-06-01

people 33.5
barbershop 33.1
man 32.9
person 32.7
salon 29.3
male 29.1
adult 28.7
shop 27.3
nurse 26
professional 24.5
room 22.6
smiling 22.4
indoors 21.1
mercantile establishment 21
couple 20.9
sitting 19.7
men 19.7
happy 19.4
home 19.1
women 18.2
work 18.1
patient 17.9
worker 17.4
table 17.3
office 17
medical 16.8
business 16.4
smile 16.4
businessman 15.9
businesspeople 15.2
doctor 15
hospital 14.7
cheerful 14.6
teacher 14.5
two 14.4
team 14.3
place of business 14
teamwork 13.9
lifestyle 13.7
clinic 13.6
colleagues 13.6
working 13.3
together 13.1
coat 13
portrait 12.9
waiter 12.7
happiness 12.5
group 12.1
mature 12.1
computer 12
20s 11.9
indoor 11.9
businesswoman 11.8
health 11.8
day 11.8
40s 11.7
talking 11.4
life 11.1
occupation 11
family 10.7
face 10.6
interior 10.6
laboratory 10.6
medicine 10.6
desk 10.4
senior 10.3
educator 10.3
clothing 10.2
casual 10.2
holding 9.9
employee 9.9
specialist 9.8
job 9.7
two people 9.7
lab 9.7
student 9.7
mid adult 9.6
restaurant 9.6
30s 9.6
meeting 9.4
chair 9.3
modern 9.1
color 8.9
celebration 8.8
love 8.7
education 8.7
bed 8.5
lab coat 8.5
instrument 8.4
care 8.2
groom 8.2
laptop 8.2
to 8
associates 7.9
scientist 7.8
casual clothing 7.8
middle aged 7.8
corporate 7.7
test 7.7
attractive 7.7
old 7.7
enjoying 7.6
adults 7.6
communication 7.6
drink 7.5
human 7.5
clothes 7.5
technology 7.4
service 7.4
new 7.3
looking 7.2
handsome 7.1
romantic 7.1
science 7.1
mother 7
dining-room attendant 7

Google
created on 2019-06-01

Photograph 96.8
Snapshot 82.5
Black-and-white 68.3
Event 67
Room 65.7
Rehearsal dinner 65.7
Table 63.8
Photography 62.4
Team 59.4
Family 56.6

Microsoft
created on 2019-06-01

table 95.8
person 95.4
clothing 93.8
tableware 89.8
indoor 89.2
old 83.7
man 82.5
human face 81.7
food 79
woman 74.5
bottle 70.3
group 57.5
smile 54.9
posing 52.2

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 54.4%
Sad 46%
Angry 45.8%
Happy 50%
Confused 45.6%
Calm 46.7%
Surprised 45.6%
Disgusted 45.3%

AWS Rekognition

Age 48-68
Gender Male, 53.9%
Happy 46.7%
Surprised 45.5%
Calm 50.6%
Sad 45.9%
Disgusted 45.5%
Angry 45.4%
Confused 45.4%

AWS Rekognition

Age 20-38
Gender Female, 50.8%
Disgusted 45.6%
Happy 47.5%
Surprised 45.7%
Sad 47.6%
Angry 45.4%
Confused 45.5%
Calm 47.8%

AWS Rekognition

Age 26-43
Gender Female, 50.6%
Disgusted 45.4%
Confused 45.2%
Surprised 45.3%
Calm 49.1%
Happy 45.4%
Angry 45.3%
Sad 49.2%

AWS Rekognition

Age 26-43
Gender Female, 50.7%
Confused 45.4%
Calm 45.5%
Sad 52.6%
Surprised 45.3%
Angry 45.3%
Disgusted 45.5%
Happy 45.4%

AWS Rekognition

Age 26-43
Gender Female, 52%
Disgusted 45.3%
Calm 48.8%
Confused 45.6%
Sad 48.1%
Surprised 45.5%
Angry 45.4%
Happy 46.3%

AWS Rekognition

Age 29-45
Gender Female, 52.2%
Confused 45.4%
Sad 45.8%
Calm 46.9%
Happy 51.3%
Surprised 45.3%
Disgusted 45.1%
Angry 45.3%

AWS Rekognition

Age 20-38
Gender Female, 54%
Angry 45.2%
Sad 45.3%
Happy 53.6%
Calm 45.4%
Confused 45.2%
Disgusted 45.1%
Surprised 45.3%

AWS Rekognition

Age 26-43
Gender Female, 54%
Happy 53.1%
Confused 45.3%
Angry 45.2%
Disgusted 45.1%
Surprised 45.2%
Sad 45.8%
Calm 45.2%

AWS Rekognition

Age 26-43
Gender Female, 51.2%
Angry 45.5%
Happy 46.4%
Confused 45.7%
Sad 48.6%
Calm 47.8%
Surprised 45.6%
Disgusted 45.4%

AWS Rekognition

Age 20-38
Gender Male, 52.6%
Disgusted 45.3%
Surprised 45.6%
Angry 45.4%
Confused 45.6%
Sad 48.6%
Calm 49%
Happy 45.6%

Feature analysis

Amazon

Person 99.7%
Dining Table 57.3%

Text analysis

Amazon

AGENCY
TESSIER
RG TESSIER AGENCY
RG

Google

&TESSIER AGENCY INSURANCE OF ALL KINDS 15
&TESSIER
AGENCY
INSURANCE
OF
ALL
KINDS
15