Human Generated Data

Title

Untitled (women seated with children and dog on rug over stone steps outdoors)

Date

c. 1940

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12668

Machine Generated Data

Tags (each label is followed by the service's confidence, in %)

Amazon
created on 2022-02-04

Person 98.4
Human 98.4
Person 97.7
Chair 97.1
Furniture 97.1
Clothing 95.5
Apparel 95.5
Nature 93.9
Outdoors 93
Female 89.6
Face 88.9
Dress 88
Tree 87.5
Plant 87.5
Water 86.4
Vegetation 85.3
Person 75
Woman 73.8
Person 72.9
Pants 72.9
Kid 72
Child 72
Grass 71.5
Girl 69.7
Person 68.9
Yard 67.7
Portrait 67.4
Photography 67.4
Photo 67.4
Weather 59
Park 57.4
Lawn 57.4
Sitting 56.4
Teen 55.3
Baby 55.2
Path 55
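
These labels match the output shape of AWS Rekognition's DetectLabels API: one label name per line with a confidence score in percent. A minimal sketch of reproducing such a list with boto3; the image path and the MinConfidence cutoff are illustrative assumptions, not values recorded by the museum:

```python
import boto3

# Assumption: a local copy of the photograph; AWS credentials are
# picked up from the environment as usual for boto3.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the list above bottoms out around 55
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```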

Clarifai
created on 2023-10-29

people 99.7
monochrome 97.6
man 97
art 96.4
adult 96.3
child 95.8
sit 95
street 94.3
seat 93.5
woman 93.2
bench 93
furniture 90.5
two 88.7
print 87.6
vintage 87.6
group 87.4
chair 87
portrait 86.7
sitting 86.2
boy 83.4
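
These concepts look like the output of Clarifai's general image-recognition model, which scores each concept between 0 and 1. A hedged sketch against the Clarifai v2 REST API; the API key and model ID are placeholders, and the exact model behind this record is not stated:

```python
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder credential
MODEL_ID = "general-image-recognition"   # assumed public general model

with open("photo.jpg", "rb") as f:       # image path is an assumption
    image_b64 = base64.b64encode(f.read()).decode("ascii")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

# Concept values are 0-1; scale to percent to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```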

Imagga
created on 2022-02-04

swing 78.8
mechanical device 64.9
plaything 63.7
mechanism 48.5
man 18.1
male 17
wall 15.4
dirty 15.4
grunge 15.3
silhouette 14.9
person 14.8
fountain 14.7
people 14.5
black 14.4
cello 13.8
adult 13.6
old 13.2
bowed stringed instrument 13
human 12
style 11.9
structure 11.8
dark 11.7
cool 11.5
stringed instrument 11.5
sexy 10.4
women 10.3
water 10
outdoor 9.9
attractive 9.8
posing 9.8
couple 9.6
musical instrument 9.2
alone 9.1
danger 9.1
sensuality 9.1
sport 9.1
portrait 9.1
sunset 9
one 9
lady 8.9
building 8.9
body 8.8
light 8.7
barrow 8.7
men 8.6
bench 8.5
fashion 8.3
dress 8.1
fitness 8.1
urban 7.9
stretching 7.8
balance 7.6
relaxation 7.5
vintage 7.4
action 7.4
vessel 7.4
teenager 7.3
exercise 7.3
active 7.2
art 7.2
love 7.1
modern 7
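
Imagga's tagger returns English tag names with percent confidences, which is the shape of the list above. A sketch assuming Imagga's v2 /tags endpoint; the key/secret pair and image path are placeholders:

```python
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credentials,
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # used as HTTP basic auth

with open("photo.jpg", "rb") as f:     # image path is an assumption
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )
resp.raise_for_status()

# Each entry carries an English tag name and a percent confidence.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```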

Microsoft
created on 2022-02-04

text 99.9
person 78.8
black and white 69.5
drawing 51.4
painting 15.4
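
Microsoft's tags correspond to the Tags feature of the Azure Computer Vision Analyze Image operation, which reports confidences between 0 and 1 (scaled to percent in the list above). A sketch against the v3.2 REST endpoint; the resource endpoint, key, and image path are placeholders:

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

with open("photo.jpg", "rb") as f:  # image path is an assumption
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Tag confidences are 0-1; scale to percent to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```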

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 97%
Calm 78%
Happy 18.5%
Surprised 1.4%
Confused 0.5%
Sad 0.5%
Disgusted 0.4%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Male, 94.2%
Happy 50.7%
Calm 35%
Sad 7%
Surprised 3.9%
Angry 1.2%
Disgusted 0.8%
Confused 0.8%
Fear 0.8%

AWS Rekognition

Age 37-45
Gender Male, 60.4%
Calm 92.2%
Happy 4.7%
Sad 2.3%
Disgusted 0.2%
Surprised 0.2%
Angry 0.2%
Fear 0.1%
Confused 0.1%
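
Each block above matches one FaceDetails entry from Rekognition's DetectFaces API when called with Attributes=["ALL"]: an estimated age range, a gender guess with its confidence, and an emotion distribution that sums to roughly 100%. A minimal boto3 sketch; the image path is an assumption:

```python
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # image path is an assumption
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unordered; sort by confidence as in the blocks above.
    for emotion in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```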

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
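
The Google Vision rows are the per-face likelihood ratings (VERY_UNLIKELY through VERY_LIKELY) that the face detection feature returns for emotion, headwear, and blur. A sketch using the google-cloud-vision client library; it assumes credentials are configured via GOOGLE_APPLICATION_CREDENTIALS and that the installed library exposes the Likelihood enum (v2.x and later):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # image path is an assumption
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```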

Feature analysis

Amazon

Person
Person 98.4%
Person 97.7%
Person 75%
Person 72.9%
Person 68.9%
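
The repeated Person rows are not duplicates: countable Rekognition labels such as "Person" carry one Instance per detection, each with its own confidence and normalized bounding box. A sketch of pulling them out of a DetectLabels response; the image path is an assumption:

```python
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # image path is an assumption
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    if label["Name"] == "Person":
        # One Instance per detected person, with a 0-1 bounding box.
        for inst in label.get("Instances", []):
            box = inst["BoundingBox"]
            print(f"Person {inst['Confidence']:.1f}% "
                  f"(left={box['Left']:.2f}, top={box['Top']:.2f}, "
                  f"w={box['Width']:.2f}, h={box['Height']:.2f})")
```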

Captions

Microsoft
created on 2022-02-04

an old photo of a person 67.9%
an old photo of a person 67.8%
old photo of a person 63.3%
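
These caption candidates match the Describe Image operation of Azure Computer Vision, which returns up to maxCandidates captions with 0-1 confidences. A sketch against the v3.2 REST endpoint; endpoint, key, and image path are placeholders:

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

with open("photo.jpg", "rb") as f:  # image path is an assumption
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Caption confidences are 0-1; scale to percent as in the list above.
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```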