Human Generated Data

Title

Untitled (portrait of two children reading)

Date

c. 1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1970

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.2
Human 99.2
Person 98
Nature 70.2
Outdoors 67.6
Housing 66.6
Building 66.6
Clothing 63.8
Apparel 63.8
Kid 63.7
Child 63.7
Photography 63.2
Photo 63.2
Worker 61.4
Hug 60.7
Girl 60.4
Female 60.4
Door 60
Flooring 56.3
Hairdresser 55.7

Clarifai
created on 2023-10-26

people 99.5
child 98.9
monochrome 97.9
two 96.2
baby 94
man 93.8
son 93.2
family 92.7
adult 92.5
indoors 91.7
portrait 90.7
one 90.3
woman 89.2
boy 86
group 85.4
wear 84.8
black and white 83.7
room 83.3
girl 82
love 81.2

Imagga
created on 2022-01-22

vessel 42.5
bathtub 42.5
people 28.4
person 28.4
adult 28
tub 26.4
bathroom 25.2
portrait 24.6
lifestyle 23.9
happy 23.8
child 22.8
bath 21.8
pretty 21.7
home 21.5
male 21.4
man 20.8
face 20.6
care 19.8
sexy 19.3
cute 18.7
hair 18.2
attractive 18.2
laptop 17.5
boy 17.4
health 17.4
smiling 17.4
smile 17.1
kid 16.8
little 16.8
clean 16.7
childhood 16.1
relaxation 15.9
water 15.3
indoors 14.9
couple 14.8
sitting 14.6
computer 14.5
skin 14.4
body 14.4
relax 14.3
happiness 14.1
room 13.9
lady 13
youth 12.8
casual 12.7
one 12.7
women 12.7
love 12.6
spa 12.6
wet 12.5
holding 12.4
hygiene 12.3
blond 12
human 12
fun 12
healthy 12
baby 11.5
technology 11.1
adorable 11.1
sensual 10.9
washing 10.7
soap 10.7
foam 10.7
cheerful 10.6
groom 10.3
nice 10.1
alone 10
house 10
children 10
leisure 10
working 9.7
looking 9.6
erotic 9.5
bubble 9.4
two 9.3
mature 9.3
joy 9.2
table 9.1
innocent 8.9
infant 8.7
wash 8.7
work 8.6
toddler 8.6
husband 8.6
wife 8.5
kids 8.5
hand 8.4
relaxing 8.2
life 8
handsome 8
shower 8
interior 8
look 7.9
color 7.8
expression 7.7
loving 7.6
head 7.6
communication 7.6
fashion 7.5
senior 7.5
treatment 7.3
office 7.3
indoor 7.3
playing 7.3
business 7.3
sensuality 7.3
negative 7.2
professional 7.1
family 7.1
mother 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 98.2
man 90.1
black and white 86.9
clothing 71.3
text 64.4

Face analysis

AWS Rekognition

Age 12-20
Gender Male, 78.1%
Calm 67.2%
Disgusted 12.4%
Surprised 4.6%
Happy 4.5%
Confused 3.4%
Fear 3.2%
Angry 2.7%
Sad 1.9%

AWS Rekognition

Age 7-17
Gender Female, 82.1%
Angry 55.3%
Calm 33.1%
Surprised 3%
Sad 2.9%
Disgusted 2.8%
Happy 1.8%
Confused 0.7%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Categories

Imagga

pets animals 99.4%

Captions

Microsoft
created on 2022-01-22

a person standing in a room 70.7%
a man and a woman standing in a room 53%