Human Generated Data

Title

Untitled (young man and kids sitting on ground, woman standing)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16351

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Clothing 99.3
Apparel 99.3
Person 99.1
Human 99.1
Person 98.5
Person 93.7
Outdoors 92.2
Nature 92
Person 89.5
Person 89.5
Person 84.4
Person 74.7
Countryside 72.9
Hat 71.5
People 67.1
Portrait 65.6
Photography 65.6
Face 65.6
Photo 65.6
Helmet 62.3
Overcoat 59.5
Coat 59.5
Rural 59.2
Grass 57.9
Plant 57.9
Suit 57.2
Military Uniform 56.9
Military 56.9
Person 48.7
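
The labels above follow the shape of AWS Rekognition object-and-scene detection output: a label name paired with a confidence score from 0 to 100. As a minimal sketch of how comparable label/confidence pairs can be retrieved, assuming boto3 and a hypothetical S3 location for the scan (the bucket and key names below are placeholders, not the museum's actual storage):

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical bucket and object key; substitute the real location of the image.
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.16351.jpg"}},
        MaxLabels=30,
        MinConfidence=40,
    )

    for label in response["Labels"]:
        # Each entry pairs a label name with a 0-100 confidence score.
        print(f'{label["Name"]} {label["Confidence"]:.1f}')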

Clarifai
created on 2023-10-28

people 99.9
group together 99.2
group 98.9
child 97.8
man 96.7
adult 95.6
several 93.5
wear 92.7
three 92.1
boy 91.1
many 90.5
soldier 89
four 88.7
war 88.1
outfit 87.9
military 87.4
administration 87
recreation 85.5
leader 84.9
sibling 84.8

Imagga
created on 2022-02-11

brass 100
wind instrument 85
musical instrument 62.5
landscape 22.3
snow 20.6
man 17.5
outdoors 17.2
outdoor 16.8
shovel 16.3
winter 16.2
old 15.3
male 14.9
sky 14.7
cold 14.6
tree 14.6
beach 14.3
people 13.4
water 13.3
sand 13.1
summer 12.9
field 12.5
vacation 12.3
rural 11.5
outside 11.1
adult 11
holiday 10.7
tool 10.6
forest 10.4
relax 10.1
park 9.9
trees 9.8
lifestyle 9.4
sea 9.4
season 9.3
travel 9.1
scenery 9
black 9
country 8.8
hand tool 8.8
snowy 8.7
scene 8.7
person 8.5
weather 8.3
work 8.2
sunset 8.1
scenic 7.9
portrait 7.8
frost 7.7
building 7.6
sport 7.5
ocean 7.5
vintage 7.4
countryside 7.3
sun 7.2
active 7.2
day 7.1
happiness 7
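
Tags like the Imagga list above are typically produced by Imagga's /v2/tags REST endpoint. A rough sketch using the requests library, with placeholder credentials and a hypothetical image URL, assuming the v2 response layout (a list of tags, each with a confidence and language-keyed text):

    import requests

    # Placeholder credentials; the /v2/tags endpoint uses HTTP Basic auth.
    auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/4.2002.16351.jpg"},
        auth=auth,
    )

    for entry in response.json()["result"]["tags"]:
        # Each entry holds a confidence score and the tag text keyed by language code.
        print(entry["tag"]["en"], round(entry["confidence"], 1))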

Microsoft
created on 2022-02-11

outdoor 96.7
text 77.5
clothing 68.4
black and white 65.3
person 57.6
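
The Microsoft tags resemble output from Azure Computer Vision's image tagging operation. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK, using a placeholder endpoint, key, and image URL:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key for an Azure Computer Vision resource.
    client = ComputerVisionClient(
        "https://example-resource.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("AZURE_CV_KEY"),
    )

    result = client.tag_image("https://example.org/4.2002.16351.jpg")
    for tag in result.tags:
        # Confidences are returned in the 0-1 range; shown here as percentages.
        print(tag.name, round(tag.confidence * 100, 1))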

Face analysis

AWS Rekognition

Age 41-49
Gender Female, 72.5%
Happy 62.4%
Surprised 19.5%
Calm 15.4%
Sad 0.7%
Confused 0.7%
Disgusted 0.6%
Angry 0.4%
Fear 0.2%
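
These age, gender, and emotion estimates match the structure of an AWS Rekognition DetectFaces response. A minimal sketch with boto3, again assuming a hypothetical S3 location for the scan:

    import boto3

    rekognition = boto3.client("rekognition")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.16351.jpg"}},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back unordered; sort by confidence to mirror the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')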

Feature analysis

Amazon

Person 99.1%
Helmet

Categories

Imagga

paintings art 98%
beaches seaside 1.1%

Text analysis

Amazon

21.
rap
KODVK-SVEELX
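
The detected strings above, including the partially legible film edge marking, have the form of AWS Rekognition DetectText line output. A short sketch with boto3, using the same hypothetical S3 location:

    import boto3

    rekognition = boto3.client("rekognition")

    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.16351.jpg"}}
    )

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # print whole lines rather than individual words
            print(detection["DetectedText"])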

Google

21.
21.