Human Generated Data

Title

Untitled (children wearing fur hats)

Date

1951

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17828

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.9
Human 99.9
Person 99.9
Person 99.5
Person 98.6
Clothing 98.3
Apparel 98.3
Nature 85.8
Outdoors 85.4
Coat 79.5
Face 71.6
Person 70.9
People 66.9
Leisure Activities 66.5
Ice 65.8
Portrait 64.1
Photography 64.1
Photo 64.1
Person 62.9
Stage 60.4
Female 59
Flooring 58.9
Suit 57.6
Overcoat 57.6
Crowd 56.8
Costume 56.7
Plant 56.3
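
The Amazon list above has the shape of output returned by AWS Rekognition label detection. A minimal sketch of such a call, assuming boto3 is configured with AWS credentials and using a hypothetical local filename for the photograph:

import boto3

client = boto3.client("rekognition")

# Hypothetical filename for the scanned photograph.
with open("untitled_children_fur_hats.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=30,        # cap the number of labels returned
        MinConfidence=55.0,  # drop low-confidence guesses
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')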

Clarifai
created on 2023-10-29

people 99.9
child 99.6
group 98.5
group together 98
many 97.3
boy 96.7
several 96
wear 94.8
man 94
dancing 93.7
adolescent 93.6
woman 92.8
recreation 91.6
school 91.3
adult 91.2
education 91.1
music 88.6
elementary school 87.9
enjoyment 83.1
outfit 81.8
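
The Clarifai tags above match the concept/confidence pairs produced by Clarifai's general recognition model. A hedged sketch against the v2 REST "outputs" endpoint, assuming an app-scoped API key in the CLARIFAI_KEY environment variable; the model ID and image URL are placeholders, and Clarifai scores concepts from 0 to 1 (hence the scaling):

import os
import requests

url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
headers = {"Authorization": f"Key {os.environ['CLARIFAI_KEY']}"}
payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}

resp = requests.post(url, headers=headers, json=payload, timeout=30)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')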

Imagga
created on 2022-02-26

people 26.2
adult 21.8
person 21.7
city 18.3
man 18.1
clothing 16.6
urban 16.6
fashion 16.6
portrait 16.2
black 15.7
dress 13.5
men 12.9
women 12.6
walking 12.3
teacher 11.3
legs 11.3
style 11.1
street 11
model 10.9
male 10.6
sexy 10.4
historic 10.1
uniform 10.1
world 10
human 9.7
business 9.7
group 9.7
professional 9.3
educator 9.3
smile 9.3
travel 9.1
old 9.1
family 8.9
station 8.7
standing 8.7
scene 8.7
youth 8.5
two 8.5
attractive 8.4
art 8.4
silhouette 8.3
vintage 8.3
back 8.3
pose 8.1
history 8
posing 8
body 8
hair 7.9
indoors 7.9
architecture 7.8
color 7.8
ancient 7.8
pretty 7.7
happy 7.5
traditional 7.5
outfit 7.4
tradition 7.4
girls 7.3
lifestyle 7.2
shop 7.2
interior 7.1
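
Imagga's tags correspond to its v2 tagging endpoint, which authenticates an API key/secret pair over HTTP basic auth. A minimal sketch with placeholder credentials and a placeholder image URL:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # replace with real credentials
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')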

Google
created on 2022-02-26

Gesture 85.3
Window 84.2
Black-and-white 83.5
Font 79.7
Art 76.6
Monochrome photography 72.7
Fun 72.3
Monochrome 72.2
Vintage clothing 70.8
Event 69.2
Photo caption 65.6
Illustration 65.3
Boot 65.1
Stock photography 64.3
Visual arts 64.1
Crew 62.6
Recreation 61.8
Room 61.7
Team 57.3
Hat 57.2
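
The Google list corresponds to Cloud Vision label detection. A minimal sketch using the google-cloud-vision client library, assuming GOOGLE_APPLICATION_CREDENTIALS points at a service-account key; Vision reports scores from 0 to 1, scaled here to match the percentages above:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical filename for the scanned photograph.
with open("untitled_children_fur_hats.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")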

Microsoft
created on 2022-02-26

person 99
text 98.3
clothing 96
window 84
footwear 79.2
man 74.7
posing 35.3
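
Microsoft's tags match the Azure Computer Vision "analyze" operation with the Tags visual feature. A hedged sketch of the v3.2 REST endpoint, assuming the resource endpoint and key live in environment variables and using a placeholder image URL:

import os
import requests

endpoint = os.environ["AZURE_CV_ENDPOINT"]  # e.g. https://<resource>.cognitiveservices.azure.com
key = os.environ["AZURE_CV_KEY"]

resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.com/photo.jpg"},
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')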

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 83.7%
Sad 67.9%
Calm 24.5%
Fear 2.2%
Confused 1.7%
Surprised 1.4%
Disgusted 1.2%
Angry 0.8%
Happy 0.4%

AWS Rekognition

Age 10-18
Gender Male, 98.4%
Calm 92.4%
Happy 3.8%
Sad 2.1%
Confused 0.5%
Surprised 0.5%
Angry 0.2%
Disgusted 0.2%
Fear 0.2%
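
Both age/gender/emotion blocks above have the shape of the FaceDetails records returned by Rekognition face analysis. A minimal sketch, assuming boto3 credentials and the same hypothetical filename:

import boto3

client = boto3.client("rekognition")

with open("untitled_children_fur_hats.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')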

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
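
The Google Vision face results report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of the face-detection call that yields one such block per detected face, assuming credentials are configured as above:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_children_fur_hats.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    for field in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        value = getattr(face, f"{field}_likelihood")
        bucket = vision.Likelihood(value).name.replace("_", " ").capitalize()
        print(field.capitalize(), bucket)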

Feature analysis

Amazon

Person
Person 99.9%
Person 99.9%
Person 99.5%
Person 98.6%
Person 70.9%
Person 62.9%
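
For countable labels such as Person, Rekognition's label detection also returns per-instance confidences and bounding boxes, which is presumably where the per-person percentages above come from. A hedged sketch, reusing the detect_labels call from earlier:

import boto3

client = boto3.client("rekognition")

with open("untitled_children_fur_hats.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    if label["Name"] == "Person":
        for inst in label["Instances"]:
            box = inst["BoundingBox"]  # coordinates are ratios of image size
            print(f'Person {inst["Confidence"]:.1f}% at '
                  f'left={box["Left"]:.2f}, top={box["Top"]:.2f}, '
                  f'{box["Width"]:.2f}x{box["Height"]:.2f}')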

Text analysis

Amazon

KODOR-CVEELA
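
The string above is the raw DetectedText value returned by Rekognition text detection. A minimal sketch, assuming boto3 credentials:

import boto3

client = boto3.client("rekognition")

with open("untitled_children_fur_hats.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')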