Human Generated Data

Title

Untitled (group of children on couch)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17349

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Chair 99.4
Furniture 99.4
Person 99.1
Person 99
Person 98.8
Clothing 96.7
Apparel 96.7
Person 96.3
Dress 95.9
Shoe 95.8
Footwear 95.8
Female 94.9
Shoe 93.9
Face 93.8
Blonde 93.6
Kid 93.6
Girl 93.6
Woman 93.6
Teen 93.6
Child 93.6
Play 93
Baby 91.7
Floor 90
Couch 87.2
Indoors 85.4
Housing 85.3
Building 85.3
Shorts 78.6
Room 78.3
Sitting 77.5
Smile 77.2
Door 76
Boy 74.6
Portrait 74.1
Photography 74.1
Photo 74.1
Suit 68.5
Coat 68.5
Overcoat 68.5
Living Room 64.7
Outdoors 63.9
Flooring 63.4
People 62.7
Window 59.1
Shoe 51.8

Clarifai
created on 2023-10-29

people 99.9
child 99.8
group 99.5
group together 98.2
family 96.6
sibling 96.2
room 96.1
son 95.6
offspring 94.6
four 94.4
man 93.9
three 93.6
adult 93.3
several 93
education 91.9
boy 91.7
woman 91.4
sit 91
two 91
indoors 88.7

Imagga
created on 2022-02-26

kin 40.3
man 28.2
people 26.8
child 25
person 22.4
adult 22.4
smiling 21.7
male 21.6
home 20.7
happy 20
parent 19.4
couple 19.2
family 18.7
room 18.4
indoors 17.6
lifestyle 17.3
mother 17.1
women 16.6
dad 15.8
sitting 15.5
interior 15
love 15
happiness 14.9
portrait 14.9
father 14.4
smile 14.2
together 14
cheerful 13
men 12.9
playing 12.8
fun 12.7
outdoors 11.9
two 11.9
boy 10.4
senior 10.3
life 10.1
teacher 10.1
cute 10
holding 9.9
world 9.8
couch 9.7
patient 9.6
sibling 9.5
togetherness 9.4
youth 9.4
pretty 9.1
hospital 9
childhood 9
chair 8.9
school 8.9
bride 8.6
wife 8.5
casual 8.5
old 8.4
back 8.3
human 8.2
care 8.2
indoor 8.2
one 8.2
dress 8.1
kid 8
to 8
holiday 7.9
day 7.8
married 7.7
health 7.6
husband 7.6
loving 7.6
enjoying 7.6
leisure 7.5
mature 7.4
park 7.4
wedding 7.4
grandma 7.3
lady 7.3
girls 7.3
children 7.3
looking 7.2
classroom 7.1
face 7.1

Google
created on 2022-02-26

Window 93
Black-and-white 85.2
Shelf 84.7
Style 83.9
Building 83.8
Monochrome 76.1
Art 75.9
Monochrome photography 75.6
Bookcase 73.7
House 73.5
Room 68.1
Toddler 66.7
Curtain 64.3
Sitting 63.9
Fun 63.7
Stock photography 63.6
Visual arts 63.1
Child 62.4
Flooring 61.2
Illustration 57.7

Microsoft
created on 2022-02-26

toddler 93.8
text 92.9
clothing 89.7
house 85.4
window 83.3
person 82.3
black and white 78.6
baby 77.6
child 71.1
human face 59.5

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 89.3%
Happy 77.3%
Calm 12.1%
Surprised 5.8%
Sad 1.8%
Fear 1.5%
Disgusted 0.6%
Angry 0.6%
Confused 0.3%

AWS Rekognition

Age 38-46
Gender Male, 100%
Calm 49.9%
Surprised 48.8%
Sad 0.3%
Angry 0.3%
Confused 0.2%
Fear 0.2%
Happy 0.2%
Disgusted 0.2%

AWS Rekognition

Age 23-33
Gender Male, 62.2%
Calm 95.5%
Sad 1.9%
Angry 1.1%
Surprised 0.7%
Happy 0.3%
Disgusted 0.2%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 34-42
Gender Male, 98.3%
Calm 43.4%
Happy 40.4%
Surprised 5.3%
Angry 3.6%
Sad 3.6%
Confused 2%
Disgusted 1.2%
Fear 0.5%

AWS Rekognition

Age 27-37
Gender Female, 90.9%
Angry 43.4%
Fear 14.5%
Calm 13.1%
Happy 11.7%
Surprised 7.4%
Sad 7.2%
Confused 1.6%
Disgusted 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.7%
Person 99.1%
Person 99%
Person 98.8%
Person 96.3%
Shoe 95.8%
Shoe 93.9%
Shoe 51.8%

Text analysis

Amazon

a
YT33A2
MON YT33A2
MON

Google

MMA YT3RA2 002UA
MMA
YT3RA2
002UA