Human Generated Data

Title

Untitled (woman and boy in chairs in living room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16336

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.2
Human 99.2
Tie 98.6
Accessories 98.6
Accessory 98.6
Person 97.5
Person 96.6
Table Lamp 94.8
Clothing 94.4
Apparel 94.4
Lamp 94.3
Person 93.4
Furniture 92
Suit 90.9
Overcoat 90.9
Coat 90.9
Person 89.8
Chair 82
Tuxedo 65.3
Portrait 65
Photography 65
Photo 65
Face 65
Plant 64.4
People 61.7
Cushion 59.8
Flower 59.6
Blossom 59.6
Table 59
Living Room 58.8
Indoors 58.8
Room 58.8
Sitting 56.4
City 56.2
Building 56.2
Urban 56.2
Town 56.2
Shirt 55.8
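Each Amazon Rekognition tag above pairs a label with a confidence score (0–100). A minimal post-processing sketch, with a hypothetical `confident_labels` helper and a handful of scores copied from the list above, showing how such output might be filtered by a confidence threshold:

```python
# Hypothetical sketch: filter Rekognition-style (label, confidence) pairs.
# The sample scores are taken from a few of the tags listed above.
labels = [
    ("Person", 99.2), ("Tie", 98.6), ("Table Lamp", 94.8),
    ("Chair", 82.0), ("Tuxedo", 65.3), ("Shirt", 55.8),
]

def confident_labels(pairs, threshold=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [label for label, score in pairs if score >= threshold]

print(confident_labels(labels))  # ['Person', 'Tie', 'Table Lamp']
```

Lowering the threshold admits weaker guesses such as "Tuxedo" (65.3), which illustrates why low-confidence tags in these lists are best treated as suggestions rather than facts.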

Clarifai
created on 2023-10-28

people 99.8
group 98.4
adult 97.8
woman 96.6
man 96.3
three 95.9
two 94.2
group together 93
four 92.3
sit 91.2
chair 90.9
facial expression 89.5
uniform 87.4
wear 85.2
leader 84.8
medical practitioner 84.5
child 84.5
several 83.6
military 81.1
outfit 77.5

Imagga
created on 2022-02-11

man 45
nurse 40.9
person 40.6
male 39
people 36.3
professional 35.2
adult 34.1
doctor 31.9
medical 30.9
businessman 29.1
business 28.5
coat 27.8
patient 25.6
worker 25.2
team 25.1
job 24.8
smiling 23.9
happy 23.8
hospital 23.5
office 23.3
mature 23.2
men 23.2
work 22.8
colleagues 21.4
health 20.1
looking 20
life 19.5
portrait 19.4
medicine 19.4
clinic 19.3
indoors 19.3
lab coat 18.9
group 18.5
businesspeople 18
senior 17.8
working 16.8
teamwork 16.7
kin 16.6
care 16.5
smile 16.4
sitting 15.5
desk 15.1
table 14.7
occupation 14.7
businesswoman 14.5
stethoscope 14.4
30s 14.4
casual 14.4
elderly 14.4
handsome 14.3
meeting 14.1
corporate 13.7
20s 13.7
indoor 13.7
sax 13.5
suit 13.5
women 13.4
day 13.3
practitioner 13
camera 12.9
human 12.7
lab 12.6
lifestyle 12.3
modern 11.9
case 11.9
two 11.9
room 11.9
associates 11.8
laboratory 11.6
profession 11.5
bright 11.4
talking 11.4
cheerful 11.4
face 11.4
couple 11.3
color 11.1
laptop 10.9
doctors 10.8
waiter 10.8
clinical 10.7
physician 10.7
40s 10.7
light 10.7
assistant 10.7
wind instrument 10.6
together 10.5
old 10.5
computer 10.4
home 10.4
brass 10.1
employee 10.1
surgeon 10
uniform 10
clothing 9.9
staff 9.6
education 9.5
research 9.5
cornet 9.5
treatment 9.2
student 9.2
sick person 9.2
specialist 9.2
confident 9.1
technology 8.9
coworkers 8.8
older 8.7
chemical 8.7
brunette 8.7
chemistry 8.7
standing 8.7
mid adult 8.7
dining-room attendant 8.7
four 8.6
executive 8.6
serious 8.6
successful 8.2
science 8
writing 8
business people 7.9
discussing 7.9
forties 7.8
happiness 7.8
50s 7.8
scientist 7.8
consultant 7.8
discussion 7.8
middle aged 7.8
cooperation 7.7
adults 7.6
horizontal 7.5
success 7.2
aged 7.2

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

person 98.4
man 96.9
text 89.9
wedding dress 85.9
clothing 83.6
wedding 80.7
bride 75.5
vase 56.5
posing 55.7
old 46.9
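The services above often tag the same image with overlapping labels. A minimal sketch, using hypothetical names and small case-normalized samples drawn from the lists above, of finding labels that all three tagging services agree on:

```python
# Hypothetical sketch: intersect tag sets from three of the services above.
# Each set is a small, lowercased sample of that service's label list.
amazon   = {"person", "tie", "chair", "lamp", "suit", "sitting"}
clarifai = {"people", "adult", "woman", "man", "sit", "chair", "uniform"}
imagga   = {"man", "person", "people", "adult", "suit", "chair", "sitting"}

common = amazon & clarifai & imagga
print(sorted(common))  # ['chair']
```

Near-synonyms ("sit" vs. "sitting", "person" vs. "people") defeat a plain set intersection, so a real comparison across services would likely need stemming or a synonym map.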

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 24-34
Gender Male, 94.9%
Surprised 62.3%
Happy 21.1%
Sad 10.7%
Disgusted 1.5%
Confused 1.5%
Calm 1.4%
Fear 1%
Angry 0.5%

AWS Rekognition

Age 49-57
Gender Female, 95.5%
Calm 59%
Sad 21.5%
Happy 12.7%
Confused 2.8%
Fear 1.7%
Angry 0.9%
Disgusted 0.8%
Surprised 0.6%

AWS Rekognition

Age 38-46
Gender Female, 60%
Happy 94.7%
Surprised 1.7%
Calm 1.4%
Sad 0.8%
Confused 0.6%
Fear 0.3%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 41-49
Gender Male, 76.7%
Calm 64.6%
Happy 31.1%
Confused 1.6%
Sad 1.2%
Fear 0.5%
Disgusted 0.4%
Angry 0.4%
Surprised 0.2%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Calm 72.4%
Happy 12.4%
Confused 8.7%
Sad 2%
Disgusted 1.6%
Angry 1.4%
Surprised 1%
Fear 0.4%
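Each AWS Rekognition face block above reports a probability for every candidate emotion. A minimal sketch, with a hypothetical `dominant_emotion` helper and values copied from the first face above, of selecting the highest-probability emotion:

```python
# Hypothetical sketch: pick the top emotion from a Rekognition-style
# distribution (values from the first face block above).
face_emotions = {
    "Surprised": 62.3, "Happy": 21.1, "Sad": 10.7, "Disgusted": 1.5,
    "Confused": 1.5, "Calm": 1.4, "Fear": 1.0, "Angry": 0.5,
}

def dominant_emotion(emotions):
    """Return the (name, probability) pair with the highest probability."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(face_emotions))  # ('Surprised', 62.3)
```

Where the top two probabilities are close (as in the fourth face above, Calm 64.6% vs. Happy 31.1%), the argmax alone hides real uncertainty, so the full distribution is worth keeping.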

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Tie
Lamp
Person 99.2%
Person 97.5%
Person 96.6%
Person 93.4%
Person 89.8%
Tie 98.6%
Lamp 94.3%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

62.
are