Human Generated Data

Title

Milan photograph: Wilmarth and three others at opening with "Standing Deeps L.A.," Galleria dell'Ariete, 1973

Date

c. 1973

People

Artist: Christopher Wilmarth, American, 1943 - 1987

Classification

Archival Material

Credit Line

Harvard Art Museums/Fogg Museum, The Christopher Wilmarth Archive, Gift of Susan Wilmarth-Rabineau, CW2001.936

Copyright

© Estate of Christopher Wilmarth

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.7
Human 99.7
Person 99.6
Person 99.1
Person 99
Art 97.9
Art Gallery 95.7
Drawing 70.2
Poster 64.7
Advertisement 64.7
Sketch 63.6
Collage 62.3
Floor 62
Silhouette 57.6
Flooring 56.4

Clarifai
created on 2023-10-26

woman 98.9
portrait 98.8
people 98.8
girl 98.3
man 97.9
art 97
adult 96.6
window 96
indoors 93.8
museum 92.9
one 92.7
model 92.1
exhibition 92
telephone 90.9
business 90.5
painting 89.4
reflection 89
two 88.6
wedding 88.5
monochrome 88.3

Imagga
created on 2022-01-16

telephone 21
business 19.4
silhouette 19
device 18.8
call 18.6
window 18.4
man 17.5
people 17.3
person 15.6
office 15.1
black 14.6
electronic equipment 14.6
male 14.3
pay-phone 13.8
adult 13.6
businessman 13.2
wall 13
elevator 12.7
equipment 12.4
room 11.6
portrait 11.6
light 11.4
spectator 10.4
men 10.3
door 10.3
work 10.2
interior 9.7
working 9.7
building 9.7
women 9.5
corporate 9.4
lifting device 9.4
alone 9.1
architecture 8.6
happy 8.1
computer 8
body 8
smile 7.8
hands 7.8
face 7.8
child 7.8
sitting 7.7
old 7.7
communication 7.6
vintage 7.5
house 7.5
dark 7.5
executive 7.4
lady 7.3
laptop 7.3
figure 7.2
dial telephone 7.2
modern 7

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

wall 96.5
person 96.2
text 93
art 92
gallery 91.2
man 90.4
black and white 88.2
clothing 84.1
scene 83
room 80.4
suit 59.7

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 100%
Calm 66.4%
Confused 12.7%
Angry 7.9%
Happy 2.9%
Surprised 2.9%
Sad 2.9%
Fear 2.2%
Disgusted 2%

AWS Rekognition

Age 18-24
Gender Female, 56.5%
Calm 55.2%
Disgusted 12.7%
Surprised 11.2%
Sad 9%
Confused 4.8%
Fear 4.7%
Happy 1.5%
Angry 1%

AWS Rekognition

Age 22-30
Gender Female, 99.9%
Calm 96.2%
Fear 2%
Disgusted 0.5%
Confused 0.4%
Angry 0.4%
Surprised 0.3%
Sad 0.2%
Happy 0.1%

AWS Rekognition

Age 14-22
Gender Male, 99.4%
Calm 85.4%
Sad 13.4%
Angry 0.3%
Fear 0.3%
Confused 0.3%
Disgusted 0.2%
Surprised 0.2%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%