Human Generated Data

Title

Untitled (man and two women in front of buffet with candles)

Date

1940

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19505

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Accessories 99.9
Accessory 99.9
Tie 99.9
Apparel 99.8
Clothing 99.8
Human 99.2
Person 99.2
Person 97.8
Person 97.1
Coat 79.9
Sleeve 75.9
Text 73.2
Glass 72.7
Food 66.3
Meal 66.3
Suit 62.3
Overcoat 62.3
Finger 62.3
Dish 61.9
Creme 61.3
Cake 61.3
Cream 61.3
Dessert 61.3
Icing 61.3
Plant 60.3
Flower 60.3
Blossom 60.3
Home Decor 59.5
Linen 57.9
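
The Amazon tags above pair a label name with a confidence score, in the style of AWS Rekognition label detection. The sketch below shows one way such tags could be produced with boto3; the file name "photo.jpg", the region, and the MinConfidence cutoff are assumptions for illustration, not details taken from the museum's pipeline.

    # Sketch: produce Rekognition-style label tags for a local image.
    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed cutoff; the listing above ends near 57.9
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")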

Clarifai
created on 2019-10-29

people 99.7
man 97.6
portrait 97.4
adult 97.2
facial expression 96.3
group 96
leader 94.4
four 93.3
wear 92.5
three 92.2
woman 91.7
administration 91.4
two 91.3
musician 90.7
group together 90.4
music 88.6
veil 87.2
one 87
five 86.8
several 85.4

Imagga
created on 2019-10-29

man 51.1
male 46.1
person 43.6
doctor 42.3
professional 39
adult 34.3
medical 33.6
hospital 32
coat 31.8
office 30.5
stethoscope 30.2
people 29.6
lab coat 28.5
businessman 28.3
job 27.4
business 26.1
handsome 25.9
happy 25.7
medicine 25.6
portrait 25.3
mature 24.2
looking 23.2
smiling 23.2
occupation 22.9
team 22.4
men 22.3
health 22.2
clinic 21.1
colleagues 19.4
nurse 19.4
smile 19.3
care 18.9
work 18.8
indoors 18.5
businesspeople 18
camera 17.6
worker 17.3
confident 17.3
uniform 17.3
patient 16.7
elderly 16.3
corporate 15.5
profession 15.3
casual 15.3
bow tie 15
senior 15
standing 14.8
indoor 14.6
businesswoman 14.6
practitioner 14.5
lifestyle 14.5
staff 14.4
necktie 14.2
attractive 14
couple 13.9
garment 13.8
group 13.7
success 13.7
laptop 13.7
meeting 13.2
teamwork 13
20s 12.8
face 12.1
friendly 11.9
day 11.8
executive 11.8
expertise 11.7
suit 11.6
30s 11.6
bright 11.4
desk 11.3
human 11.3
clothing 11.1
happiness 11
horizontal 10.9
student 10.6
working 10.6
cheerful 10.6
computer 10.4
women 10.3
employee 10.3
two 10.2
successful 10.1
associates 9.8
doctors 9.8
kin 9.8
surgeon 9.8
physician 9.8
teacher 9.7
one 9.7
confidence 9.6
color 9.5
sitting 9.5
aged 9.1
good mood 8.8
eye contact 8.8
clinical 8.8
40s 8.8
middle aged 8.8
health care 8.7
education 8.7
partnership 8.6
exam 8.6
illness 8.6
planner 8.6
speaker 8.5
modern 8.4
old 8.4
specialist 8.3
holding 8.3
healthy 8.2
light 8
home 8
building 7.9
guy 7.9
crew 7.9
coworkers 7.9
diagnosis 7.8
consultant 7.8
discussion 7.8
cooperation 7.7
busy 7.7
formal 7.6
talking 7.6
tie 7.6
manager 7.5
room 7.4
black 7.2
hair 7.1
life 7

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

wall 97.8
posing 94.2
text 86.1
man 82.2
person 82.1
human face 77
candle 70.1
suit 65.2
group 63
clothing 59.1
old 55.8
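
The Microsoft tags above have the shape of Azure Computer Vision tag output (lower-case tag plus confidence). A hedged sketch of one way to request such tags over the REST API is below; the endpoint region, API version, and subscription key are placeholders, not values from the source.

    # Sketch: request Azure Computer Vision tags for a local image.
    # Endpoint region, API version, and key are placeholders (assumptions).
    import requests

    endpoint = "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze"
    headers = {
        "Ocp-Apim-Subscription-Key": "<subscription-key>",
        "Content-Type": "application/octet-stream",
    }
    params = {"visualFeatures": "Tags"}

    with open("photo.jpg", "rb") as f:
        response = requests.post(endpoint, headers=headers, params=params, data=f.read())
    response.raise_for_status()

    for tag in response.json().get("tags", []):
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")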

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 36-52
Gender Male, 59.9%
Happy 7%
Angry 5.7%
Disgusted 1.3%
Calm 4.7%
Sad 3.8%
Confused 2.7%
Fear 17.9%
Surprised 56.9%

AWS Rekognition

Age 26-40
Gender Male, 98.2%
Happy 0.3%
Calm 0.1%
Angry 1.2%
Surprised 76.5%
Sad 0%
Confused 0.3%
Disgusted 0.3%
Fear 21.3%

AWS Rekognition

Age 43-61
Gender Female, 89.6%
Calm 14%
Confused 1%
Sad 9%
Angry 4.7%
Surprised 1.7%
Disgusted 1.3%
Happy 66.7%
Fear 1.6%
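
Each AWS Rekognition block above corresponds to one detected face, with an estimated age range, a gender guess, and per-emotion confidences. A minimal boto3 sketch that returns the same fields is below; the image path and region are assumptions for illustration.

    # Sketch: per-face age range, gender, and emotion scores via Rekognition.
    # "photo.jpg" and the region are placeholder values (assumptions).
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")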

Feature analysis

Amazon

Tie 99.9%
Person 99.2%
Suit 62.3%

Categories

Text analysis

Amazon

SI
223190A9939U2A13A
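
The Amazon strings above appear to be Rekognition text detections, likely reading the print's edge markings. A minimal boto3 sketch for the same kind of call is below; as in the earlier sketches, the image path and region are assumptions.

    # Sketch: detect printed text (e.g., photo edge markings) with Rekognition.
    # "photo.jpg" and the region are placeholder values (assumptions).
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}")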

Google

JS г23яя ИАЯЯ З9U2 АТЭА
JS
г23яя
ИАЯЯ
З9U2
АТЭА