Human Generated Data

Title

Untitled (street scene, New Orleans)

Date

c. 1935

People

Artist: Phyllis Moore Stoll, American, active 1940-1969

Artist: C. Bennette Moore, American, 1879-1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1671

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.4
Human 99.4
Person 99.4
Person 99.1
Person 99
Clothing 97.2
Apparel 97.2
Sitting 83.9
Painting 73.1
Art 73.1
Path 69
Walkway 69
Outdoors 63.8
Overcoat 62.3
Coat 62.3
Hat 59.4
Pants 56
Sleeve 55.2

Clarifai
created on 2023-10-25

people 100
street 99.5
group 99
woman 98.6
child 98.6
adult 98.2
man 97.9
group together 97.8
wear 94.2
two 93
three 92.2
art 91.7
monochrome 91.4
boy 91.2
many 89.4
several 88.2
veil 88.1
cavalry 85.5
four 85.5
lid 84.6

Imagga
created on 2022-01-09

old 22.3
sculpture 19.4
statue 19.1
religion 18.8
religious 15.9
architecture 15.6
ancient 15.6
person 15.2
stone 13.6
vintage 13.2
building 13
world 12.7
temple 12.5
people 12.3
antique 12.2
fashion 12.1
dress 11.7
art 11.1
chair 11
black 10.9
barbershop 10.8
man 10.7
male 10.6
shop 10.5
portrait 10.4
clothing 10.2
history 9.8
adult 9.8
human 9.7
interior 9.7
throne 9.5
travel 9.2
posing 8.9
sexy 8.8
pray 8.7
hair 8.7
grunge 8.5
dirty 8.1
wind instrument 7.9
artistic 7.8
seat 7.8
musical instrument 7.7
chair of state 7.6
monument 7.5
close 7.4
harmonica 7.4
historic 7.3
lady 7.3
sensuality 7.3
carving 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 96.7
furniture 88
chair 83.4
person 83.3
black and white 81.8
clothing 80.8
old 55.8
several 10.9

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 99.9%
Sad 97.6%
Calm 1.3%
Confused 0.5%
Fear 0.3%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 18-24
Gender Male, 94.2%
Calm 91.2%
Confused 2.8%
Sad 2.3%
Surprised 1.6%
Fear 0.8%
Disgusted 0.7%
Angry 0.4%
Happy 0.2%

AWS Rekognition

Age 6-16
Gender Male, 100%
Sad 50.6%
Calm 33.7%
Fear 10.6%
Confused 2.9%
Angry 1%
Surprised 0.5%
Happy 0.4%
Disgusted 0.3%

AWS Rekognition

Age 36-44
Gender Female, 100%
Calm 35.7%
Sad 31.4%
Fear 16.5%
Confused 5.5%
Angry 4%
Disgusted 2.9%
Surprised 2.8%
Happy 1.3%

Microsoft Cognitive Services

Age 54
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Painting 73.1%
