Human Generated Data

Title

Untitled (family outside of C. Bennette Moore Photography Studio)

Date

c. 1930

People

Artist: C. Bennette Moore, American, 1879 - 1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21747

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.6
Human 99.6
Person 99.5
Person 99.2
Person 99
Clothing 98.7
Apparel 98.7
Person 98.6
Person 98
Shorts 94.9
Person 92.9
Pedestrian 92.5
Person 92
Path 82.6
Female 77.6
People 71.4
Pants 67.3
Girl 64.1
Shop 62.2
Overcoat 62.1
Coat 62.1
Kid 61.7
Child 61.7
City 61.2
Building 61.2
Urban 61.2
Town 61.2
Woman 59.6
Road 59.4
Shoe 56
Footwear 56
Tree 55.7
Plant 55.7
Person 53.6
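
The tag list above pairs each detected label with a confidence score. As a minimal sketch only (not a description of the museum's actual pipeline), output of this shape could be produced with the AWS Rekognition detect_labels call via boto3; the local file name photo.jpg and the MinConfidence cutoff of 50 are illustrative assumptions:

    import boto3

    # Hypothetical local copy of the photograph; the actual source image
    # and threshold used for this record are not stated.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)
    for label in response["Labels"]:
        # Prints e.g. "Person 99.6", matching the tag/confidence pairs above.
        print(f"{label['Name']} {label['Confidence']:.1f}")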

Clarifai
created on 2023-10-22

people 100
group 99.4
group together 99.4
adult 98.6
woman 97
child 97
man 96
many 95.7
several 92.4
monochrome 92
street 91.1
recreation 89.5
administration 88.5
four 87
leader 82.6
five 81.8
three 79
offspring 78.8
music 78.6
canine 76.4

Imagga
created on 2022-03-11

people 31.8
man 23.5
crutch 22.9
adult 21.7
person 21.5
beach 20.2
couple 18.3
staff 17.7
lifestyle 17.3
walking 17
male 15.7
portrait 15.5
attractive 15.4
women 15
sand 14.8
vacation 14.7
street 14.7
travel 14.1
men 13.7
summer 13.5
stick 13.3
shop 13.3
city 13.3
happy 13.2
together 13.1
fashion 12.8
outdoors 12.8
dress 12.6
walk 12.4
urban 12.2
pretty 11.9
life 11.7
outdoor 11.5
world 11.1
love 11
sea 10.9
silhouette 10.8
child 10.6
casual 10.2
leisure 10
ocean 10
transportation 9.9
modern 9.8
sky 9.6
two 9.3
holiday 9.3
transport 9.1
old 9.1
human 9
barbershop 8.9
lady 8.9
style 8.9
group 8.9
sexy 8.8
boy 8.7
sitting 8.6
model 8.6
youth 8.5
relationship 8.4
stall 8.4
black 8.4
alone 8.2
building 8.2
tourist 8.1
sidewalk 8.1
road 8.1
father 8.1
sunset 8.1
water 8
family 8
body 8
business 7.9
parent 7.8
newspaper 7.8
chair 7.7
elegant 7.7
train 7.7
mercantile establishment 7.7
crowd 7.7
running 7.7
relax 7.6
joy 7.5
fun 7.5
window 7.3
sun 7.2
smiling 7.2
looking 7.2
bright 7.1
posing 7.1
happiness 7

Google
created on 2022-03-11

Photograph 94.1
White 92.2
Black 89.9
Black-and-white 86.6
Style 84
Font 80.2
Building 77.3
Monochrome 77
Monochrome photography 76
City 71.7
Road 69.6
Event 69.1
Fun 69
Pedestrian 67.9
Street 65.8
Advertising 64.9
Street fashion 60.4
Facade 57.9
Crowd 56.1
Room 56

Microsoft
created on 2022-03-11

text 99.7
outdoor 98.9
clothing 98.4
footwear 94.4
person 93.4
black and white 88.9
standing 88.1
woman 86.2
people 85
man 84.9
street 82.2
store 41.5

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 97.7%
Calm 64.9%
Sad 28.9%
Happy 2.7%
Confused 1%
Disgusted 0.8%
Angry 0.6%
Surprised 0.5%
Fear 0.5%

AWS Rekognition

Age 30-40
Gender Male, 99.1%
Calm 97.7%
Confused 1%
Surprised 0.6%
Happy 0.3%
Disgusted 0.2%
Angry 0.1%
Sad 0.1%
Fear 0%

AWS Rekognition

Age 43-51
Gender Male, 98.2%
Happy 82.7%
Calm 11.2%
Surprised 3.2%
Confused 1%
Sad 0.9%
Angry 0.5%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 30-40
Gender Male, 100%
Sad 52.9%
Calm 24.2%
Angry 8.3%
Confused 6.6%
Surprised 3.4%
Disgusted 2.1%
Happy 1.5%
Fear 1%

AWS Rekognition

Age 42-50
Gender Male, 87.2%
Fear 73.7%
Calm 10.8%
Surprised 5.8%
Sad 4.3%
Confused 1.7%
Disgusted 1.5%
Angry 1.4%
Happy 0.7%

AWS Rekognition

Age 45-51
Gender Male, 99.7%
Calm 96.7%
Sad 1.4%
Surprised 0.5%
Confused 0.5%
Angry 0.3%
Happy 0.3%
Disgusted 0.2%
Fear 0.1%
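
The age range, gender, and emotion percentages in the AWS Rekognition blocks above correspond to the FaceDetails structure returned by Rekognition face detection. A minimal sketch, assuming an illustrative local file photo.jpg, that prints results in the same shape:

    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # illustrative file name
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotion estimates.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")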

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
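
The Google Vision rows report per-face likelihoods (surprise, anger, sorrow, joy, headwear, blur) rather than percentages. A hedged sketch using the google-cloud-vision client, again assuming a local photo.jpg and that the likelihood enums are printed by name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # illustrative file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood enums carry names such as VERY_UNLIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)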

Feature analysis

Amazon

Person
Person 99.6%
Person 99.5%
Person 99.2%
Person 99%
Person 98.6%
Person 98%
Person 92.9%
Person 92%
Person 53.6%

Shoe
Shoe 56%

Text analysis

Amazon

316
DISTINCTION
l
l Bonnette Mooro
Bonnette
Mooro
PHOTOGRAPHS the DISTINCTION
the
PHOTOGRAPHS

Google

316 OOre Joon YT37A2-XAO
316
OOre
Joon
YT37A2-XAO
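
The text-analysis entries are raw OCR detections, kept as the services returned them (including misreadings of the studio signage such as "Bonnette Mooro"). Rekognition reports both LINE and WORD results, which is why phrases like "PHOTOGRAPHS the DISTINCTION" appear alongside their individual words. A minimal sketch of such a call, with photo.jpg as an illustrative input:

    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # illustrative file name
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        # Each detection is typed LINE or WORD, so whole phrases and their
        # individual words both appear in the output.
        print(detection["Type"], detection["DetectedText"])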