Human Generated Data

Title

Untitled (Mappila [Moplah] man and woman)

Date

c. 1860–1880

People

Artist: Willoughby Wallace Hooper, British, 1837–1912

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Kenyon C. Bolton III Fund, 2018.52

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Apparel 100
Clothing 100
Human 99.5
Person 99.5
Person 99.5
Hat 94.3
Sombrero 94.2
Shorts 82.2
Sun Hat 80
Back 73.1
Undershirt 56.7
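Each machine-generated tag above pairs a label with the service's confidence as a percentage. A minimal sketch of parsing such lines into structured pairs (the line format "label score" is an assumption based on the listing above; multi-word labels like "Sun Hat" are handled by splitting on the last whitespace):

```python
def parse_tags(lines):
    """Parse 'label score' lines into (label, float score) pairs."""
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines between tag blocks
        label, score = line.rsplit(None, 1)  # split off trailing score
        tags.append((label, float(score)))
    return tags

sample = ["Apparel 100", "Sun Hat 80", "Back 73.1"]
print(parse_tags(sample))  # → [('Apparel', 100.0), ('Sun Hat', 80.0), ('Back', 73.1)]
```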

Clarifai
created on 2018-10-18

people 99.9
adult 99.2
wear 98.7
two 97.9
woman 96.8
portrait 95.2
one 95.2
man 92.1
group 87.2
outfit 86.6
three 86.3
umbrella 84.9
veil 84.6
lid 84.1
retro 83.7
child 80.9
military 80.8
music 79.3
vehicle 75.7
group together 75.2

Imagga
created on 2018-10-18

hat 100
sombrero 100
headdress 100
clothing 68.6
cowboy hat 58.5
covering 36.1
consumer goods 34.4
person 25
man 24.2
portrait 23.9
model 22.5
people 22.3
male 22
adult 20
cowboy 19.8
black 18
fashion 17.3
posing 16.9
sexy 16.9
hair 15.8
human 15.7
guy 14.7
attractive 14.7
old 13.9
lifestyle 13.7
body 13.6
one 13.4
umbrella 13.2
boy 13
face 12.8
western 12.6
handsome 12.5
hand 12.1
pose 11.8
cute 11.5
smile 11.4
lady 11.4
fun 11.2
shirt 11.2
looking 11.2
child 10.9
happy 10.7
pretty 10.5
costume 10.1
style 9.6
leisure 9.1
kid 8.9
horse 8.5
expression 8.5
two 8.5
skin 8.5
studio 8.4
vintage 8.3
outdoors 8.2
sensual 8.2
sensuality 8.2
religion 8.1
women 7.9
statue 7.8
mysterious 7.8
youth 7.7
serious 7.6
head 7.6
dark 7.5
senior 7.5
strong 7.5
retro 7.4
blond 7.2

Google
created on 2018-10-18

Microsoft
created on 2018-10-18

person 98.7
standing 98.5
wall 95.3
outdoor 89.6
posing 48.6

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 57.7%
Sad 23.2%
Disgusted 4%
Angry 8.5%
Happy 0.8%
Confused 7.8%
Surprised 5.9%
Calm 49.7%

AWS Rekognition

Age 10-15
Gender Male, 50.9%
Disgusted 2.4%
Happy 2%
Surprised 1.9%
Sad 11.3%
Calm 47.5%
Angry 23.9%
Confused 10.9%
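The AWS Rekognition blocks above report a confidence score for each emotion; the highest-scoring one (here, "Calm" for both faces) is usually taken as the dominant emotion. A minimal sketch of that selection, assuming the scores have been collected into a dictionary like the listings above:

```python
def dominant_emotion(emotions):
    """Return the emotion name with the highest confidence score."""
    return max(emotions, key=emotions.get)

# Scores from the first AWS Rekognition face block above.
face_one = {"Sad": 23.2, "Disgusted": 4.0, "Angry": 8.5, "Happy": 0.8,
            "Confused": 7.8, "Surprised": 5.9, "Calm": 49.7}
print(dominant_emotion(face_one))  # → Calm
```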

Microsoft Cognitive Services

Age 19
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Hat 94.3%

Captions

Microsoft

a couple of people that are standing in the rain holding an umbrella 57.5%
a couple of people that are standing in the rain 57.4%
a person standing posing for the camera 57.3%