Human Generated Data

Title

Untitled (man giving speech in rain)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16843

Machine Generated Data

Tags (model confidence, shown on a 0-100 scale)

Amazon
created on 2022-02-26

Clothing 99.5
Apparel 99.5
Person 98.9
Human 98.9
Water 96
Person 94.8
Overcoat 93.1
Person 88.4
Nature 86.5
Outdoors 76
Floor 67.8
Suit 65.4
Pool 63.7
Coat 57.7
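
The label/score pairs above have the shape of AWS Rekognition DetectLabels output. The snippet below is a minimal sketch of how such a list could be produced with boto3; the file name, MaxLabels, and MinConfidence values are hypothetical and are not part of the original record.

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph; the record does not include a file path.
    with open("untitled_man_giving_speech_in_rain.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,       # assumed cap, roughly the length of the list above
        MinConfidence=50,   # assumed threshold; the lowest score shown above is 57.7
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")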

Clarifai
created on 2023-10-28

people 99.8
group together 98
woman 96.6
adult 96
monochrome 94.8
child 94.5
wear 93.9
group 92.4
two 91.7
man 91
street 89.6
water 87
administration 86.4
recreation 85
rain 79.6
several 78.1
reflection 76.5
boy 76.5
offspring 76.3
three 75.5
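
The Clarifai concepts above are the kind of output returned by Clarifai's prediction API, which reports values in the 0-1 range that appear here rescaled to 0-100. The following is only a rough sketch against the clarifai-grpc Python client; the model id, API key, and image URL are placeholders, not details from this record.

    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    channel = ClarifaiChannel.get_grpc_channel()
    stub = service_pb2_grpc.V2Stub(channel)
    metadata = (("authorization", "Key YOUR_CLARIFAI_API_KEY"),)  # placeholder credential

    request = service_pb2.PostModelOutputsRequest(
        model_id="general-image-recognition",  # assumed id for Clarifai's public general model
        inputs=[
            resources_pb2.Input(
                data=resources_pb2.Data(
                    image=resources_pb2.Image(url="https://example.org/photo.jpg")  # placeholder URL
                )
            )
        ],
    )
    response = stub.PostModelOutputs(request, metadata=metadata)

    for concept in response.outputs[0].data.concepts:
        # Clarifai returns concept values in 0-1; the list above appears rescaled to 0-100.
        print(concept.name, round(concept.value * 100, 1))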

Imagga
created on 2022-02-26

water 34.7
kin 33.4
ocean 27
beach 26.5
sea 25.8
people 25.1
lake 24
sunset 23.4
man 19.6
outdoors 18.8
summer 18
travel 16.2
city 15.8
male 15.8
silhouette 15.7
sun 14.7
person 14.6
wet 14.3
leisure 14.1
sky 14
reflection 13.9
sport 13.3
walking 13.3
child 13.1
couple 13.1
sand 12.7
landscape 12.6
coast 12.6
recreation 12.6
fishing 12.5
river 12.5
sidewalk 12.3
active 11.7
tourism 11.6
outdoor 11.5
fisherman 11.4
boy 11.3
evening 11.2
world 11.1
tourist 11
urban 10.5
fun 10.5
waves 10.2
lifestyle 10.1
activity 9.9
vacation 9.8
standing 9.6
walk 9.5
outside 9.4
happy 9.4
light 9.4
relax 9.3
pier 8.9
wave 8.6
shore 8.5
sunrise 8.4
boat 8.4
old 8.4
life 8.3
street 8.3
calm 8.2
romantic 8
sunlight 8
adult 8
trainer 8
love 7.9
holiday 7.9
happiness 7.8
men 7.7
pond 7.7
winter 7.7
hobby 7.6
joy 7.5
black 7.4
action 7.4
building 7.4
alone 7.3
color 7.2
architecture 7.2
day 7.1
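
The Imagga tags above resemble responses from Imagga's REST tagging service. The snippet below is a sketch only, assuming the v2 /tags endpoint with HTTP basic authentication; the key, secret, and image URL are placeholders.

    import requests

    # Placeholder credentials and image location; not part of the original record.
    IMAGGA_KEY, IMAGGA_SECRET = "your_api_key", "your_api_secret"
    IMAGE_URL = "https://example.org/photo.jpg"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        params={"image_url": IMAGE_URL},
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))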

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

water 98.7
text 98
clothing 94.9
outdoor 94.7
person 89.2
standing 77.9
black and white 73
man 72.7
lake 68.9
old 48
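
The Microsoft tags above are in the style of Azure's Computer Vision image-tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK might look like the following; the endpoint, key, and image URL are placeholders, and the SDK's 0-1 confidences are rescaled to match the 0-100 display above.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key; not part of the original record.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )

    result = client.tag_image("https://example.org/photo.jpg")
    for tag in result.tags:
        # Azure reports confidence in 0-1; rescale to match the values shown above.
        print(tag.name, round(tag.confidence * 100, 1))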

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 34-42
Gender Male, 99.7%
Happy 73.3%
Calm 12.1%
Fear 9.6%
Surprised 2.4%
Sad 1%
Confused 0.7%
Angry 0.6%
Disgusted 0.5%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 98.9%
Happy 0.3%
Surprised 0.3%
Sad 0.2%
Confused 0.2%
Disgusted 0.1%
Fear 0%
Angry 0%
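
The two face records above, each with an age range, a gender estimate, and an emotion distribution summing to roughly 100%, match the shape of AWS Rekognition DetectFaces output when all attributes are requested. A minimal boto3 sketch, with a hypothetical file name:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_man_giving_speech_in_rain.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Ordering is not guaranteed; sort by confidence to mirror the display above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")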

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
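
The two blocks above, with buckets such as "Very unlikely" and "Possible", follow Google Cloud Vision face detection, which reports joy, sorrow, anger, surprise, headwear, and blur as Likelihood enums rather than percentages. A short sketch with the google-cloud-vision Python client; the file name is hypothetical.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled_man_giving_speech_in_rain.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihood values are enum members (VERY_UNLIKELY ... VERY_LIKELY), shown above as prose.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)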

Feature analysis

Amazon

Person
Coat
Person 98.9%
Person 94.8%
Person 88.4%
Coat 57.7%
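
The per-object percentages above (three Person detections and one Coat) correspond to the instance-level detections that Rekognition DetectLabels returns alongside scene-level labels: some labels carry an Instances list with a bounding box and a confidence per detected object. A self-contained sketch of reading those instances, again with a hypothetical file name:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_man_giving_speech_in_rain.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    for label in response["Labels"]:
        # Labels such as Person include per-object instances; most labels have an empty list.
        for instance in label.get("Instances", []):
            box = {k: round(v, 2) for k, v in instance["BoundingBox"].items()}
            print(f"{label['Name']} {instance['Confidence']:.1f}%", box)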

Categories

Text analysis

Amazon

90

Google

90
90
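
Both services read the single string "90" out of the image. Text like this is typically recovered with AWS Rekognition DetectText (or Google Vision text detection); a minimal boto3 sketch with a hypothetical file name:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_man_giving_speech_in_rain.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip the word-level duplicates
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")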