Human Generated Data

Title

Untitled (girls having makeup done)

Date

1937

People

Artist: Harris & Ewing, American, 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22328

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 98.9
Human 98.9
Person 98.8
Person 98.6
Person 98.1
Meal 89.8
Food 89.8
Restaurant 84.3
Clothing 82.6
Apparel 82.6
Indoors 70.7
Sleeve 67.7
Cafeteria 67.5
Sitting 67.2
Room 66.4
People 62.1
Dish 59.8
Female 55.6
Plant 55.3
Photography 55
Photo 55
Person 54.8
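
The Amazon labels above are confidence-scored predictions of the kind returned by AWS Rekognition's label-detection call. A minimal sketch, assuming boto3 credentials are configured and using a placeholder file name:

```python
import boto3

rekognition = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:   # placeholder file name
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,       # cap the number of labels returned
        MinConfidence=50,   # drop low-confidence predictions
    )

# Each label carries a name and a 0-100 confidence, matching the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```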

Clarifai
created on 2023-10-22

people 100
group 98.8
group together 98.7
woman 98.3
adult 98
monochrome 97.7
room 96.8
furniture 96
child 95.9
man 94.8
boy 94.2
administration 90.9
four 90.7
three 90.2
two 89.1
offspring 88.8
actress 88.4
five 86.8
dining room 86.7
several 86.5
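
Clarifai's concepts and scores are typically obtained by posting an image to one of its public models. A minimal sketch against the v2 REST API; the model identifier, owner app, access token, and image URL below are placeholders/assumptions:

```python
import requests

PAT = "your_clarifai_pat"                     # placeholder access token
IMAGE_URL = "https://example.com/photo.jpg"   # placeholder image URL

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {PAT}",
             "Content-Type": "application/json"},
    json={
        # Assumed owner app of the public general model.
        "user_app_id": {"user_id": "clarifai", "app_id": "main"},
        "inputs": [{"data": {"image": {"url": IMAGE_URL}}}],
    },
)
resp.raise_for_status()

# Concept values are 0-1 scores; multiply by 100 to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```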

Imagga
created on 2022-03-11

salon 76.2
shop 27.1
people 24.5
man 24.2
table 23.7
interior 22.1
kitchen 19.6
indoors 19.3
room 18.8
person 18.2
home 17.5
barbershop 17.2
male 17.1
restaurant 16.1
mercantile establishment 15.7
glass 14.2
happy 13.8
men 13.7
luxury 13.7
lifestyle 13
adult 12.4
business 12.1
chair 12.1
window 12.1
indoor 11.9
style 11.9
fashion 11.3
modern 11.2
party 11.2
furniture 11.1
house 10.9
place of business 10.6
cooking 10.5
dinner 10.3
black 10.2
work 10.2
happiness 10.2
decoration 10.1
smiling 10.1
smile 10
pretty 9.8
family 9.8
food 9.7
dining 9.5
women 9.5
sitting 9.4
cook 9.1
attractive 9.1
cheerful 8.9
new 8.9
decor 8.8
working 8.8
couple 8.7
elegance 8.4
hairdresser 8.3
inside 8.3
case 8.2
life 8
portrait 7.8
elegant 7.7
setting 7.7
knife 7.7
counter 7.7
building 7.4
service 7.4
glasses 7.4
retro 7.4
celebration 7.2
stove 7.1
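
Imagga tags are typically retrieved from its REST tagging endpoint with HTTP basic authentication. A minimal sketch, assuming the /v2/tags endpoint and placeholder credentials and image URL:

```python
import requests

API_KEY = "your_imagga_api_key"               # placeholder credentials
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.com/photo.jpg"   # placeholder image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry has an English tag name and a 0-100 confidence score.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```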

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

person 98.8
indoor 95.9
kitchen 95.5
clothing 83.9
black and white 71.4
table 68.8
text 63.3
counter 52.9
preparing 50.3
cooking 31.2
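
The Microsoft tags resemble the output of Azure AI Vision's image-tagging operation. A minimal sketch against the v3.2 REST endpoint, with placeholder endpoint, key, and image URL:

```python
import requests

ENDPOINT = "https://your-resource.cognitiveservices.azure.com"  # placeholder
KEY = "your_azure_key"                                          # placeholder
IMAGE_URL = "https://example.com/photo.jpg"                     # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY,
             "Content-Type": "application/json"},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Confidence is on a 0-1 scale; multiply by 100 to match the list above.
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```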

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 24-34
Gender Female, 94.7%
Calm 73.8%
Surprised 13.1%
Happy 9.9%
Confused 1.5%
Disgusted 0.9%
Fear 0.4%
Sad 0.3%
Angry 0.2%

AWS Rekognition

Age 21-29
Gender Female, 94.2%
Calm 99.7%
Sad 0.2%
Happy 0.1%
Angry 0%
Surprised 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 18-26
Gender Male, 84%
Calm 100%
Sad 0%
Surprised 0%
Happy 0%
Angry 0%
Disgusted 0%
Confused 0%
Fear 0%
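
The age range, gender, and emotion estimates above are the face attributes AWS Rekognition returns when all attributes are requested. A minimal sketch, assuming a placeholder local file:

```python
import boto3

rekognition = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:   # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],          # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are scored individually; sort to show the dominant one first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```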

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
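
The Google Vision entries report categorical likelihoods rather than percentages. A minimal sketch with the google-cloud-vision client, assuming application credentials are configured and using a placeholder file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:   # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values index into this tuple (pattern from Google's samples).
likelihood_name = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
                   "POSSIBLE", "LIKELY", "VERY_LIKELY")

for face in response.face_annotations:
    print("Surprise", likelihood_name[face.surprise_likelihood])
    print("Anger", likelihood_name[face.anger_likelihood])
    print("Sorrow", likelihood_name[face.sorrow_likelihood])
    print("Joy", likelihood_name[face.joy_likelihood])
    print("Headwear", likelihood_name[face.headwear_likelihood])
    print("Blurred", likelihood_name[face.blurred_likelihood])
```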

Feature analysis

Amazon

Person
Person 98.9%
Person 98.8%
Person 98.6%
Person 98.1%
Person 54.8%
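
The per-person confidences in this feature list correspond to the instance-level detections that Rekognition attaches to labels such as Person. A minimal sketch, again with a placeholder file name:

```python
import boto3

rekognition = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:   # placeholder file name
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    if label["Name"] == "Person":
        # Each detected person has its own bounding box and confidence score.
        for instance in label["Instances"]:
            print(f"Person {instance['Confidence']:.1f}%")
```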

Text analysis

Amazon

OLD
the
the to
to
MEANS
TREATRIGHT
one
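
The detected text fragments above are of the kind returned by AWS Rekognition's text-detection call, which reports both LINE and WORD detections. A minimal sketch, assuming a placeholder local file:

```python
import boto3

rekognition = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:   # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE groupings and individual WORD tokens.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")
```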