Human Generated Data

Title

Untitled (man posed looking at baby in stroller on sidewalk)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9255

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.8
Human 98.8
Person 96.1
Vehicle 88.9
Transportation 88.9
Clothing 87.1
Apparel 87.1
Face 76.4
Tricycle 74.3
Portrait 70.2
Photography 70.2
Photo 70.2
Plant 63.9
Sitting 63.8
Kid 63.4
Child 63.4
Wheel 61.3
Machine 61.3
Chair 59
Furniture 59
Tree 58.7
Grass 58.2
Dress 57.4

Clarifai
created on 2023-10-26

people 99.9
two 97.1
monochrome 97
adult 96.8
child 96.6
chair 96.2
sit 94.7
man 94
nostalgia 93.9
family 93.3
wear 91.2
vehicle 90.3
seat 89.5
portrait 88.2
furniture 87.3
offspring 86.8
home 85.1
three 84.4
one 84
sitting 83.4

Imagga
created on 2022-01-23

bench 33.3
park bench 29.5
people 29
seat 25
park 23.9
wheeled vehicle 23.5
outdoors 22.5
tricycle 22.4
man 22.2
mother 22.1
parent 21.7
adult 21.4
portrait 21.4
couple 20
love 19.7
happy 18.8
family 18.7
male 18.6
person 17.7
happiness 17.2
vehicle 17.1
married 16.3
father 15.9
furniture 15.5
outdoor 15.3
dad 15.1
bride 14.4
women 14.2
child 13.7
dress 13.6
day 13.3
together 13.1
smiling 13
wedding 12.9
wheelchair 12
old 11.8
conveyance 11.8
groom 11.7
tree 11.5
smile 11.4
home 11.2
men 11.2
two 11
lifestyle 10.8
traditional 10.8
daughter 10.7
kin 10.7
attractive 10.5
senior 10.3
celebration 9.6
sitting 9.4
youth 9.4
joy 9.2
teenager 9.1
summer 9
cheerful 8.9
romantic 8.9
chair 8.8
autumn 8.8
husband 8.6
outside 8.6
marriage 8.5
culture 8.5
face 8.5
winter 8.5
black 8.4
clothing 8.4
fashion 8.3
girls 8.2
romance 8
sepia 7.8
casual 7.6
statue 7.6
wife 7.6
furnishing 7.6
togetherness 7.6
nurse 7.5
human 7.5
city 7.5
mature 7.4
teen 7.3
building 7.3
lady 7.3
cute 7.2
travel 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

outdoor 99.1
text 99
person 97.6
man 97.4
clothing 89.3
black and white 83.8
baby 83.3
human face 75.6
toddler 74.3

Color Analysis

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 65.9%
Happy 86.1%
Surprised 6.2%
Calm 4.8%
Disgusted 0.8%
Confused 0.7%
Angry 0.6%
Sad 0.4%
Fear 0.4%

AWS Rekognition

Age 7-17
Gender Female, 72.8%
Calm 79.9%
Happy 7.8%
Sad 4.3%
Surprised 3%
Fear 1.7%
Angry 1.4%
Confused 0.9%
Disgusted 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Categories

Imagga

paintings art 99.2%

Text analysis

Amazon

st
st st