Human Generated Data

Title

Untitled (three boys playing ball outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17600

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.8
Apparel 99.8
Person 99.8
Human 99.8
Person 99.7
Person 99.4
Shorts 98
Female 92.5
Grass 88.5
Plant 88.5
Dress 87.1
Tree 77.9
Face 75.8
Outdoors 72.9
Woman 72.5
People 71.4
Girl 69.4
Photography 68
Photo 68
Portrait 66.3
Kid 64.9
Child 64.9
Suit 59.8
Coat 59.8
Overcoat 59.8
Furniture 58.9
Shoe 58
Footwear 58
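Each Amazon tag above pairs a label with a confidence score on a 0–100 scale. A minimal sketch of filtering such a list down to high-confidence labels, using a few scores copied from this record (the helper name and the 90-point threshold are illustrative choices, not part of the data):

```python
# Label/score pairs copied from the Amazon tag list in this record.
labels = [
    ("Clothing", 99.8), ("Person", 99.8), ("Shorts", 98.0),
    ("Female", 92.5), ("Grass", 88.5), ("Girl", 69.4), ("Suit", 59.8),
]

def high_confidence(tags, threshold=90.0):
    """Return label names whose confidence score meets the threshold."""
    return [name for name, score in tags if score >= threshold]

print(high_confidence(labels))  # ['Clothing', 'Person', 'Shorts', 'Female']
```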

Clarifai
created on 2023-10-29

people 99.9
child 99.8
group together 98.1
two 97.6
family 97.1
three 96.8
adult 96.7
group 96.5
boy 95.8
offspring 95.8
woman 94.8
sibling 94.8
son 94.8
four 94.8
recreation 92.5
man 91.9
monochrome 88.2
walk 87.2
administration 86.7
wear 86.5

Imagga
created on 2022-02-26

man 29.6
people 24.5
person 20.9
male 20
outdoors 19.8
adult 19.5
sunset 18.9
silhouette 18.2
child 17.5
outdoor 16.1
beach 15.3
love 15
couple 13.9
sport 13.4
kin 13.3
walking 13.3
vacation 13.1
two 12.7
protection 11.8
weapon 11.8
happiness 11.8
park 11.5
walk 11.4
black 11.4
mask 11.3
portrait 11
danger 10.9
sky 10.8
lifestyle 10.8
recreation 10.8
holding 10.7
dusk 10.5
world 10.4
dark 10
water 10
dirty 10
travel 9.9
radioactive 9.8
romantic 9.8
radiation 9.8
toxic 9.8
protective 9.7
nuclear 9.7
chemical 9.7
gas 9.6
boy 9.6
sea 9.4
evening 9.3
holiday 9.3
ocean 9.1
industrial 9.1
summer 9
stalker 8.9
family 8.9
sun 8.9
destruction 8.8
soldier 8.8
accident 8.8
together 8.8
military 8.7
protect 8.7
clothing 8.6
outside 8.6
girls 8.2
device 8.2
active 8.1
romance 8
women 7.9
gun 7.9
camouflage 7.9
forest 7.8
play 7.8
men 7.7
rifle 7.6
happy 7.5
relationship 7.5
parent 7.5
fun 7.5
leisure 7.5
peaceful 7.3
suit 7.2
autumn 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 99.7
grass 98
clothing 97.7
footwear 94
person 90.3
black and white 79.3
boy 71.6
crowd 0.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 29-39
Gender Male, 97.5%
Calm 94.5%
Sad 2.2%
Happy 1.1%
Surprised 0.9%
Fear 0.4%
Disgusted 0.4%
Angry 0.3%
Confused 0.2%

AWS Rekognition

Age 6-16
Gender Female, 69%
Calm 39.5%
Sad 30.2%
Happy 18.7%
Surprised 6.9%
Fear 1.9%
Angry 1.1%
Disgusted 1%
Confused 0.6%

AWS Rekognition

Age 29-39
Gender Female, 78.8%
Calm 99.8%
Happy 0.1%
Sad 0.1%
Surprised 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%
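Each AWS Rekognition face block above reports a per-emotion confidence distribution. A minimal sketch of reducing one such block to its dominant emotion, using the scores from the second detected face (the dictionary layout is illustrative, not the actual Rekognition API response shape):

```python
# Emotion scores for the second detected face, copied from the record above.
emotions = {
    "Calm": 39.5, "Sad": 30.2, "Happy": 18.7, "Surprised": 6.9,
    "Fear": 1.9, "Angry": 1.1, "Disgusted": 1.0, "Confused": 0.6,
}

def dominant_emotion(scores):
    """Return the (name, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

name, score = dominant_emotion(emotions)
print(name, score)  # Calm 39.5
```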

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.8%
Person 99.7%
Person 99.4%

Categories

Text analysis

Amazon

T3S
YODHK-VEELA