Human Generated Data

Title

Untitled (kids sitting on grass with dog)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16349

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.3
Human 99.3
Clothing 98.9
Apparel 98.9
Person 98.2
Shorts 93.8
Person 93.1
Play 91.7
Grass 88.4
Plant 88.4
Shoe 85.8
Footwear 85.8
Pants 84.7
Face 83.3
Ground 82.2
Outdoors 80.1
Female 79.3
Kid 77.9
Child 77.9
People 77.2
Boy 73.7
Portrait 71
Photography 71
Photo 71
Girl 69.7
Person 68
Person 66.1
Sand 63.8
Nature 63.8
Helmet 60.8
Tree 59
Person 58.7
Baby 58.5
Dress 56.9
Shoe 56.2
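These labels and confidence scores are the kind of output returned by AWS Rekognition's DetectLabels operation. Below is a minimal boto3 sketch; the bucket name and object key are placeholders, not the actual storage location of this image, and the confidence threshold is only illustrative.

import boto3

client = boto3.client("rekognition")

# Placeholder S3 location; substitute the actual image bytes or object.
response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.16349.jpg"}},
    MaxLabels=50,
    MinConfidence=55.0,  # illustrative cutoff; the list above bottoms out around 56%
)

# Each label has a name and a confidence score; object labels such as Person or Shoe
# also carry per-instance bounding boxes, which is why "Person" appears several times.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
    for instance in label.get("Instances", []):
        print(f'  instance {instance["Confidence"]:.1f} at {instance["BoundingBox"]}')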

Clarifai
created on 2023-10-28

people 99.9
child 99.9
group together 97.6
boy 97.3
monochrome 96.6
man 95.5
two 95.3
fun 95
enjoyment 94.3
recreation 93.4
three 92.6
wear 92.4
son 91.2
athlete 91.1
baseball 90.3
four 89.9
outfit 89.9
sibling 88.7
sports equipment 88.4
adult 87.7
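The Clarifai concepts above can be reproduced with a general image-recognition model through Clarifai's v2 API. The sketch below is an assumption-laden illustration: the model ID, API key, and image URL are placeholders, and the exact endpoint and authorization scheme depend on the Clarifai API version in use.

import requests

API_KEY = "CLARIFAI_API_KEY"                          # placeholder credential
MODEL_ID = "general-image-recognition"                # assumed general model
IMAGE_URL = "https://example.org/4.2002.16349.jpg"    # placeholder image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)

# Concepts come back with a 0-1 value; scaling by 100 matches the percentages above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')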

Imagga
created on 2022-02-11

man 24.9
person 24
people 24
child 20.7
adult 19.4
male 18.6
outdoor 18.3
sport 17.6
athlete 17.5
outdoors 16.4
beach 16.1
player 15.7
lifestyle 14.4
ballplayer 14.4
portrait 14.2
summer 13.5
active 13.5
happiness 13.3
black 13.2
happy 13.2
swing 12.3
contestant 12.2
outside 12
equipment 11.7
fun 11.2
play 11.2
mechanical device 11
sand 10.6
sea 10.2
leisure 10
old 9.8
health 9.7
women 9.5
men 9.4
youth 9.4
joy 9.2
plaything 9.2
exercise 9.1
vacation 9
recreation 9
sexy 8.8
boy 8.7
sports equipment 8.6
walk 8.6
world 8.5
brass 8.2
park 8.2
playing 8.2
danger 8.2
mechanism 8.2
sunset 8.1
activity 8.1
love 7.9
mask 7.8
ball 7.6
two 7.6
wicket 7.6
relax 7.6
walking 7.6
field 7.5
free 7.5
newspaper 7.4
water 7.3
juvenile 7.3
wind instrument 7.3
protection 7.3
smiling 7.2
holiday 7.2
cricket equipment 7.2
smile 7.1
grass 7.1
family 7.1
businessman 7.1
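The Imagga tags above correspond to the output of Imagga's /v2/tags endpoint, which takes an image URL and HTTP basic authentication with an API key/secret pair. A minimal sketch with placeholder credentials and image URL, assuming the documented response shape:

import requests

API_KEY = "IMAGGA_API_KEY"        # placeholder credential
API_SECRET = "IMAGGA_API_SECRET"  # placeholder credential
IMAGE_URL = "https://example.org/4.2002.16349.jpg"  # placeholder image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)

# Each entry pairs a confidence score with a localized tag name.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')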

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

outdoor 99.5
clothing 94.8
footwear 93.6
person 88.4
toddler 81.8
black and white 73.2
text 70.8
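The Microsoft tags above match the output of Azure's Computer Vision image-tagging operation. A minimal sketch with the Python SDK follows; the endpoint, key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://example-resource.cognitiveservices.azure.com/"  # placeholder
KEY = "AZURE_CV_KEY"                                                # placeholder
IMAGE_URL = "https://example.org/4.2002.16349.jpg"                  # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image returns tags with a 0-1 confidence; scaling by 100 matches the list above.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")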

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 83.4%
Calm 99.2%
Sad 0.3%
Happy 0.3%
Surprised 0.1%
Disgusted 0.1%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 97.5%
Calm 59.1%
Happy 24.3%
Sad 5.8%
Surprised 5.5%
Angry 1.8%
Disgusted 1.2%
Confused 1.1%
Fear 1.1%

AWS Rekognition

Age 20-28
Gender Female, 99.5%
Fear 63.2%
Calm 24.9%
Sad 6.8%
Happy 2.9%
Angry 1.1%
Surprised 0.5%
Disgusted 0.4%
Confused 0.2%
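Each AWS Rekognition face record above (age range, gender, and emotion scores) corresponds to one FaceDetails entry returned by the DetectFaces operation when all facial attributes are requested. A minimal boto3 sketch with a placeholder S3 location:

import boto3

client = boto3.client("rekognition")

# Placeholder S3 location; substitute the actual image.
response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.16349.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotions in addition to defaults
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion scores are reported per face and sum to roughly 100%.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')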

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
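The Google Vision values above are likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) from the Cloud Vision face-detection annotator. A minimal sketch with the google-cloud-vision client, reading the image from a placeholder local path:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder file name; substitute the actual image.
with open("4.2002.16349.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field holds a Likelihood enum value such as VERY_UNLIKELY or POSSIBLE.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)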

Feature analysis

Amazon

Person
Shoe
Helmet
Person 99.3%
Person 98.2%
Person 93.1%
Person 68%
Person 66.1%
Person 58.7%
Shoe 85.8%
Shoe 56.2%
Helmet 60.8%

Categories

Imagga

paintings art 95.2%
beaches seaside 3.7%
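These category scores come from an Imagga categorizer rather than the tagging endpoint. The sketch below is a hedged illustration: the "personal_photos" categorizer ID and the response shape are assumptions, and the credentials and image URL are placeholders.

import requests

API_KEY = "IMAGGA_API_KEY"        # placeholder credential
API_SECRET = "IMAGGA_API_SECRET"  # placeholder credential
IMAGE_URL = "https://example.org/4.2002.16349.jpg"  # placeholder image URL

# Assumed categorizer; Imagga's stock categorizers include labels
# such as "paintings art" and "beaches seaside".
response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
for entry in response.json()["result"]["categories"]:
    print(f'{entry["name"]["en"]} {entry["confidence"]:.1f}')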

Text analysis

Amazon

15.
KODVKSA
ГАД

Google

15. YT33A2 AGO
15.
YT33A2
AGO
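The Amazon readings above are OCR detections of the kind returned by Rekognition's DetectText operation, and the Google readings come from the Cloud Vision text annotator. A minimal boto3 sketch of the Rekognition side, with a placeholder S3 location:

import boto3

client = boto3.client("rekognition")

# Placeholder S3 location; substitute the actual image.
response = client.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.16349.jpg"}}
)

# LINE detections give whole lines of text; WORD detections give the individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')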