Human Generated Data

Title

Untitled (man doing headstand on grass, children watching)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17724

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.7
Apparel 99.7
Person 99.5
Human 99.5
Person 98.9
Person 97.6
Grass 94.3
Plant 94.3
Dress 93.3
Female 87.4
Shorts 84.6
Skirt 76.8
Outdoors 69.3
Girl 67.9
Woman 66.4
People 65.8
Portrait 63
Photography 63
Face 63
Photo 63
Kid 61.6
Child 61.6
Tree 56.1
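A minimal sketch of how a label list like the one above is often post-processed. This is an illustration, not the museum's pipeline: the tag names and confidences are copied from the Amazon results above, and `filter_labels` is a hypothetical helper that keeps only labels at or above a confidence threshold.

```python
# Illustrative only: (name, confidence) pairs taken from the Amazon
# machine tags listed above. Real Rekognition output arrives as JSON,
# but the filtering step is the same.
labels = [
    ("Clothing", 99.7), ("Person", 99.5), ("Grass", 94.3),
    ("Dress", 93.3), ("Female", 87.4), ("Outdoors", 69.3),
    ("Tree", 56.1),
]

def filter_labels(labels, threshold=90.0):
    """Keep labels whose confidence meets the threshold, highest first."""
    kept = [(name, conf) for name, conf in labels if conf >= threshold]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

print(filter_labels(labels))
# -> [('Clothing', 99.7), ('Person', 99.5), ('Grass', 94.3), ('Dress', 93.3)]
```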

Clarifai
created on 2023-10-29

people 99.9
child 99.3
two 97.9
adult 97.2
group 95.4
man 94.8
one 92.1
group together 92
boy 91
wear 90.4
son 89.6
three 88.9
woman 87.8
recreation 87.3
monochrome 86.7
fun 83.6
outfit 83.3
several 83.1
baby 80.9
girl 79.9

Imagga
created on 2022-02-26

musical instrument 27.3
accordion 26.3
parasol 23.6
man 23.5
sport 22.5
people 21.8
keyboard instrument 21
outdoor 16.8
newspaper 16.7
wind instrument 16.6
adult 16.2
summer 16.1
beach 15.8
sand 15.6
person 15.5
male 14.9
outdoors 14.2
sunset 13.5
product 13.3
umbrella 13.2
sky 12.1
field 11.7
silhouette 10.8
sun 10.5
couple 10.5
tool 10.3
creation 10.3
grass 10.3
black 10.2
relax 10.1
joy 10
danger 10
leisure 10
canopy 9.9
park 9.9
child 9.6
play 9.5
love 9.5
men 9.4
happy 9.4
freedom 9.1
horse 9
active 9
activity 9
run 8.7
animal 8.7
day 8.6
happiness 8.6
portrait 8.4
athlete 8.4
dark 8.3
ocean 8.3
rake 8.3
fun 8.2
protection 8.2
dirty 8.1
lifestyle 8
rural 7.9
sea 7.8
work 7.6
dangerous 7.6
walking 7.6
player 7.4
shelter 7.4
competition 7.3
recreation 7.2
equipment 7.1
working 7.1
travel 7
country 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

black and white 78
text 69.3
dance 54.2

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 81.1%
Calm 48.7%
Sad 36.3%
Confused 5.9%
Disgusted 3.2%
Happy 2%
Angry 1.9%
Surprised 1.3%
Fear 0.7%
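The AWS Rekognition record above reports one confidence score per emotion. A common way to summarize such a record is to take the highest-scoring emotion (here "Calm" at 48.7%). The sketch below assumes that convention; `dominant_emotion` is a hypothetical helper, not part of the Rekognition API.

```python
# Scores copied from the AWS Rekognition face record above.
emotions = {
    "Calm": 48.7, "Sad": 36.3, "Confused": 5.9, "Disgusted": 3.2,
    "Happy": 2.0, "Angry": 1.9, "Surprised": 1.3, "Fear": 0.7,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # -> Calm
```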

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
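Google Vision reports face attributes on a likelihood scale rather than as numeric confidences. One way to compare the two face records above (one "Very likely" blurred, one "Very unlikely") is to map the scale to ordinals. The ordering below uses the display strings shown on this page and is an assumption for illustration; the API itself uses enum values.

```python
# Illustrative ordinal scale, lowest to highest, using the display
# strings that appear in the Google Vision results above.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def likelihood_rank(label):
    """Ordinal position of a likelihood label (0 = Very unlikely)."""
    return LIKELIHOOD.index(label)

# The first face is rated "Very likely" blurred, the second "Very unlikely":
print(likelihood_rank("Very likely") > likelihood_rank("Very unlikely"))  # -> True
```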

Feature analysis

Amazon

Person
Person 99.5%
Person 98.9%
Person 97.6%

Captions

Microsoft
created on 2022-02-26

a person holding a dog 62.1%
a person standing next to a dog 62%
a person with a dog 61.9%

Text analysis

Amazon

7
KODAK-
AID

Google

NAGON
NAGON