Human Generated Data

Title

Untitled (killed man with onlookers, World War I, France)

Date

1918, printed later

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3570

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 96.3
Human 96.3
Person 95
Musician 94.3
Musical Instrument 94.3
Person 90.3
Person 86.4
Person 76.6
Leisure Activities 76.1
Person 67
Drummer 55.8
Percussion 55.8
Music Band 55.1

Clarifai
created on 2023-10-25

art 98.9
people 98.8
portrait 98.4
painting 96
one 94.7
war 94.7
adult 93.1
museum 91.2
man 89.4
moon 88.1
monochrome 87.7
light 87.2
picture frame 85.5
music 84.8
landscape 84.6
flame 84.4
woman 83.9
model 83.6
dark 82.2
shadow 82

Imagga
created on 2022-01-08

monitor 85.1
electronic equipment 68.3
equipment 46.8
television 33.4
blackboard 28
broadcasting 21.6
black 19.8
light 18
night 17.7
telecommunication 16.1
sky 15.3
grunge 14.5
dark 14.2
art 12.4
design 11.2
window 11.1
medium 10.7
building 10.4
silhouette 9.9
old 9.7
texture 9.7
architecture 9.4
dirty 9
landscape 8.9
digital 8.9
style 8.9
decoration 8.7
glass 8.6
wallpaper 8.4
evening 8.4
space 7.7
travel 7.7
pattern 7.5
outdoors 7.5
shape 7.5
glow 7.4
computer 7.3
sun 7.2

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.7
monitor 99.7
television 99.6
screen 98.2
indoor 86.4
drawing 76.6
ship 75.4
flat 68.4
sketch 67.4
poster 59.8
person 56
black and white 53.3
screenshot 51.2
blackboard 50.2
display 49.5
watching 46.7
image 38.5
set 35.5
picture frame 10.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 35-43
Gender Male, 99.7%
Sad 45%
Happy 41.5%
Calm 7.7%
Angry 2%
Confused 1.4%
Disgusted 1.3%
Surprised 0.6%
Fear 0.5%

AWS Rekognition

Age 27-37
Gender Male, 99.5%
Calm 89.4%
Sad 4.8%
Confused 1.4%
Disgusted 1.3%
Angry 0.9%
Fear 0.8%
Surprised 0.7%
Happy 0.7%

AWS Rekognition

Age 7-17
Gender Male, 100%
Angry 57.4%
Calm 28.6%
Surprised 5.1%
Sad 4.2%
Fear 1.8%
Happy 1.2%
Disgusted 0.9%
Confused 0.9%

AWS Rekognition

Age 22-30
Gender Male, 99.9%
Calm 96.2%
Angry 1.4%
Surprised 0.7%
Sad 0.5%
Disgusted 0.5%
Happy 0.4%
Confused 0.2%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Calm 50.6%
Fear 32.8%
Happy 7.5%
Angry 6.6%
Surprised 1.5%
Disgusted 0.5%
Sad 0.4%
Confused 0.1%

AWS Rekognition

Age 19-27
Gender Male, 96.1%
Calm 63.1%
Happy 18%
Angry 6.6%
Sad 3.8%
Fear 2.4%
Surprised 2.2%
Confused 2%
Disgusted 2%

AWS Rekognition

Age 16-24
Gender Male, 99.7%
Sad 64%
Angry 27.2%
Calm 2.8%
Surprised 1.9%
Disgusted 1.6%
Confused 1.5%
Fear 0.6%
Happy 0.4%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Calm 59.4%
Happy 27.2%
Angry 4.4%
Confused 4.4%
Disgusted 2.6%
Sad 0.8%
Surprised 0.8%
Fear 0.4%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Angry 43.1%
Sad 15.2%
Calm 14.7%
Happy 13.2%
Fear 7.4%
Surprised 3.2%
Confused 2.4%
Disgusted 0.8%

AWS Rekognition

Age 27-37
Gender Male, 99.5%
Happy 43.1%
Sad 18.2%
Calm 17.7%
Angry 13%
Confused 3.7%
Fear 1.7%
Disgusted 1.7%
Surprised 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 96.3%

Categories

Imagga

paintings art 100%

Captions