Human Generated Data

Title

Milan photograph: Wilmarth in factory yard posing with arms around nine others (Wilmarth second from left, Luigi Crippa third from left, Susan Wilmarth fourth from left), 1973

Date

c. 1973

People

Artist: Christopher Wilmarth, American, 1943-1987

Classification

Archival Material

Credit Line

Harvard Art Museums/Fogg Museum, The Christopher Wilmarth Archive, Gift of Susan Wilmarth-Rabineau, CW2001.924

Copyright

© Estate of Christopher Wilmarth

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.7
Human 99.7
Person 99.7
Person 99.6
Person 99.5
Person 99.4
Person 99
Person 98.4
Person 97.7
Person 92.7
Clothing 91.5
Apparel 91.5
Pants 81.5
Outdoors 79.2
Person 78.5
Face 78.1
Nature 73.6
Tree 70.6
Plant 70.6
Land 69
People 67.6
Meal 67.6
Food 67.6
Jeans 58.3
Denim 58.3
Leisure Activities 57.6

Clarifai
created on 2023-10-26

people 99.8
group 99.1
group together 98.9
war 97
child 96
soldier 94.9
man 93.7
adult 92.3
rifle 91.9
many 90.6
boy 90.2
monochrome 90
campsite 88.9
woman 88.3
skirmish 86.1
gun 85.7
military 84.4
administration 83.3
documentary 82
home 79.4

Imagga
created on 2022-01-16

swing 35.1
mechanical device 28.1
plaything 27.6
snow 27.2
mechanism 20.9
winter 18.7
landscape 17.1
grunge 16.2
old 16
trees 14.2
black 13.8
cold 13.8
vintage 13.2
tree 13.1
house 12.5
dirty 11.7
season 11.7
scene 11.2
chairlift 11.1
frame 10.8
man 10.8
forest 10.4
sky 10.2
kin 10.1
sport 9.9
building 9.9
world 9.8
snowy 9.7
frost 9.6
antique 9.5
light 9.4
outdoors 9.1
park 9.1
ski tow 9
resort area 8.9
urban 8.7
conveyance 8.5
structure 8.5
outdoor 8.4
people 8.4
dark 8.4
color 8.3
texture 8.3
city 8.3
street 8.3
pattern 8.2
weather 8.2
paint 8.1
rural 7.9
wall 7.9
area 7.8
art 7.8
weathered 7.6
retro 7.4
danger 7.3
architecture 7

Google
created on 2022-01-16

Tree 89.1
Mammal 85.6
Adaptation 79.3
Sky 78
Grass 76.5
Plant 76.3
Monochrome 74.7
Monochrome photography 74.6
Vintage clothing 62.2
Wood 61.9
History 60.4
Team 54.5
Room 54.3
Photographic paper 52.9

Microsoft
created on 2022-01-16

outdoor 98.8
person 98.4
clothing 97.7
man 91.3
standing 85.7
black and white 76
group 74.5
text 65.3
people 61
old 47.2

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 100%
Happy 62.8%
Calm 18.3%
Sad 6.2%
Confused 4.1%
Angry 4%
Surprised 3.1%
Disgusted 0.8%
Fear 0.7%

AWS Rekognition

Age 27-37
Gender Male, 91.1%
Happy 53.5%
Calm 33.2%
Sad 7.1%
Angry 1.9%
Confused 1.7%
Disgusted 1.1%
Surprised 0.9%
Fear 0.5%

AWS Rekognition

Age 21-29
Gender Male, 99.1%
Calm 28.4%
Happy 14%
Angry 12.5%
Confused 11.3%
Sad 10.3%
Surprised 10.1%
Disgusted 7.6%
Fear 5.9%

AWS Rekognition

Age 34-42
Gender Male, 100%
Happy 99%
Calm 0.4%
Angry 0.2%
Surprised 0.1%
Confused 0.1%
Sad 0.1%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 45-53
Gender Male, 100%
Surprised 46.5%
Calm 21.5%
Fear 16.5%
Angry 6.7%
Sad 4.6%
Happy 1.8%
Confused 1.4%
Disgusted 1%

AWS Rekognition

Age 40-48
Gender Male, 100%
Happy 49.2%
Calm 23.5%
Surprised 13.4%
Sad 4.1%
Confused 3.3%
Disgusted 3.1%
Angry 2.5%
Fear 0.9%

AWS Rekognition

Age 20-28
Gender Female, 99.7%
Happy 97.6%
Surprised 1%
Calm 0.7%
Fear 0.2%
Sad 0.1%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 31-41
Gender Male, 98.9%
Calm 87.7%
Happy 9.7%
Confused 1.3%
Disgusted 0.3%
Angry 0.3%
Surprised 0.2%
Fear 0.2%
Sad 0.2%

AWS Rekognition

Age 29-39
Gender Male, 99.4%
Calm 86%
Sad 4.1%
Happy 3.1%
Confused 2.3%
Fear 2%
Disgusted 1%
Angry 0.9%
Surprised 0.7%

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Happy 63.3%
Disgusted 14.7%
Calm 7.8%
Angry 4.5%
Confused 3.3%
Sad 3%
Surprised 2.1%
Fear 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
