Human Generated Data

Title

Untitled (four girls on bench outside)

Date

c. 1945

People

Artist: C. Bennette Moore, American, 1879 - 1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21774

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Shorts 99.6
Clothing 99.6
Apparel 99.6
Person 99.5
Human 99.5
Person 97.8
Vegetation 97.5
Plant 97.5
Smile 97.4
Face 97.4
Person 94.9
Outdoors 91.7
Female 88.1
Yard 88
Nature 88
Person 86.8
Tree 85.9
Woodland 84.3
Forest 84.3
Land 84.3
Furniture 80.9
Chair 80.4
People 75.5
Grass 75.2
Meal 74.9
Food 74.9
Swimwear 74.3
Vacation 72.4
Shoe 71.4
Footwear 71.4
Girl 70.2
Portrait 68.2
Photography 68.2
Photo 68.2
Woman 65.3
Kid 61.8
Child 61.8
Pool 58.3
Water 58.3
Park 57.6
Lawn 57.6
Play 56.9
Man 56.8
Statue 55.7
Art 55.7
Sculpture 55.7
Bench 55.2
Bikini 55
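Each machine-generated tag above pairs a label with a confidence score, with trailing zeros dropped (e.g. "Yard 88"). Assuming these come from AWS Rekognition's DetectLabels API (as the "AWS Rekognition" entries later in this record suggest), the flattening into "Label score" lines can be sketched as below; `format_labels` and the sample response are illustrative, not part of the record:

```python
def format_labels(response):
    """Flatten a Rekognition-style DetectLabels response into 'Name score'
    lines, dropping trailing zeros the way this record does (e.g. 'Yard 88')."""
    lines = []
    for label in response["Labels"]:
        score = f"{label['Confidence']:.1f}".rstrip("0").rstrip(".")
        lines.append(f"{label['Name']} {score}")
    return lines

# Sample shaped like a boto3 rekognition.detect_labels() response,
# with confidence values matching entries in this record (assumed raw values).
sample = {
    "Labels": [
        {"Name": "Shorts", "Confidence": 99.62},
        {"Name": "Person", "Confidence": 99.51},
        {"Name": "Yard", "Confidence": 88.04},
    ]
}
print(format_labels(sample))  # ['Shorts 99.6', 'Person 99.5', 'Yard 88']
```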

Clarifai
created on 2023-10-22

people 99.8
adult 98.6
group 98.5
woman 97.6
child 97.3
group together 97
man 96.7
recreation 96.2
monochrome 95.3
boy 89.3
wear 88.6
many 85.9
girl 85.8
facial expression 84.9
fun 84.8
vehicle 82.8
bench 82.3
three 82
portrait 81.4
administration 80.9

Imagga
created on 2022-03-11

park bench 30.7
bench 28.4
seat 22.2
building 16.4
old 16
architecture 15.6
sculpture 15.4
people 14.5
cemetery 13.8
statue 13.3
pillory 13.3
furniture 13.1
structure 12.5
city 12.5
history 11.6
art 11.4
instrument of punishment 11.3
love 11
man 10.7
tourism 10.7
stone 10.7
house 10.1
outdoor 9.9
landmark 9.9
sepia 9.7
ancient 9.5
culture 9.4
park 9.1
summer 9
fountain 8.7
scene 8.6
instrument 8.6
person 8.5
travel 8.4
black 8.4
monument 8.4
world 8.3
street 8.3
historic 8.2
device 8.2
autumn 7.9
grass 7.9
palace 7.8
couple 7.8
youth 7.7
dark 7.5
outdoors 7.5
famous 7.4
style 7.4
vacation 7.4
uniform 7.4
detail 7.2
portrait 7.1
military uniform 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

tree 99.3
text 98.6
outdoor 98.3
clothing 90.3
person 87.3
smile 81.8
black and white 77.8
footwear 75.5
human face 57.2
posing 44

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 27-37
Gender Female, 65.1%
Calm 62.5%
Happy 30.9%
Confused 3.4%
Sad 1.2%
Surprised 1%
Disgusted 0.5%
Fear 0.4%
Angry 0.2%

AWS Rekognition

Age 35-43
Gender Female, 99.7%
Happy 88.4%
Calm 7%
Surprised 2.9%
Fear 0.6%
Sad 0.5%
Disgusted 0.4%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 24-34
Gender Female, 76.8%
Calm 67.1%
Happy 18.5%
Surprised 5.9%
Sad 3%
Fear 2.1%
Confused 1.5%
Disgusted 1.2%
Angry 0.7%
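Each AWS Rekognition face entry above lists an estimated age range, a gender guess with confidence, and emotions sorted by descending confidence. A minimal sketch of that rendering, assuming the boto3 DetectFaces response shape (a FaceDetail dict with AgeRange, Gender, and Emotions keys); `summarize_face` and the sample values are illustrative:

```python
def summarize_face(face):
    """Render a Rekognition-style FaceDetail the way this record does:
    age range, gender with confidence, then emotions sorted high-to-low."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%",
    ]
    for emo in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        # Rekognition emotion types are uppercase (e.g. 'CALM'); title-case them.
        lines.append(f"{emo['Type'].capitalize()} {emo['Confidence']:.1f}%")
    return lines

# Sample shaped like one FaceDetail from boto3 rekognition.detect_faces(),
# using the first face's values from this record (emotion list truncated).
face = {
    "AgeRange": {"Low": 27, "High": 37},
    "Gender": {"Value": "Female", "Confidence": 65.1},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 30.9},
        {"Type": "CALM", "Confidence": 62.5},
    ],
}
print(summarize_face(face))
# ['Age 27-37', 'Gender Female, 65.1%', 'Calm 62.5%', 'Happy 30.9%']
```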

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.5%
Person 97.8%
Person 94.9%
Person 86.8%
Shoe 71.4%

Categories

Text analysis

Amazon

60

Google

YT37A2-XAGO