Human Generated Data

Title

Aaron Siskind, Kentucky

Date

1970

People

Artist: Carl Chiarenza, American, born 1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Carl Chiarenza and Heidi Katz in memory of Charles and Mary Chiarenza, P2003.90

Copyright

© 1970 Carl Chiarenza

Machine Generated Data

Tags

Amazon
created on 2022-06-03

Person 99.8
Human 99.8
Photographer 90.8
Tripod 90.4
Photography 86
Photo 86
Glasses 79.8
Accessories 79.8
Accessory 79.8
Bird 68.1
Animal 68.1
Person 60.2
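
The figures after each tag are the service's confidence scores on a 0-100 scale. A minimal sketch, assuming the boto3 SDK, of the kind of Amazon Rekognition call that produces such labels; the image file name, region, and thresholds are placeholders rather than values from the museum's pipeline:

```python
# Hypothetical label-detection call with Amazon Rekognition (boto3).
# "siskind_kentucky.jpg" is a placeholder file name, not the actual asset.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("siskind_kentucky.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,      # cap the number of returned labels
    MinConfidence=60,  # drop labels below 60% confidence
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```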

Clarifai
created on 2023-10-30

people 99.5
two 98.2
one 98.2
wear 97.3
art 96
adult 95.1
man 94.9
administration 92.1
retro 91.6
group 89.6
portrait 88.9
exploration 87.1
vintage 85.4
three 85.3
military 84.1
journalist 84
war 82.5
old 82.4
antique 81.8
movie 81.2
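
A hedged sketch of a Clarifai v2 prediction request that returns concept tags like those above; the API key, model identifier, and image URL are assumptions, and the account's application must have access to the chosen model:

```python
# Hypothetical Clarifai concept-tagging request over the v2 REST API.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"       # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed public general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/image.jpg"}}}]},
)
response.raise_for_status()

# Concept values come back in the 0-1 range; scale to percent to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```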

Imagga
created on 2022-06-03

man 31.6
tripod 27.7
male 26.9
photographer 26.3
equipment 23
weapon 22.6
camera 20.1
rack 18.9
person 17.9
work 15.7
soldier 15.6
support 15.6
portrait 15.5
adult 15.5
military 15.4
mask 14.9
people 14.5
professional 14.3
instrument 14
gun 13.9
industry 13.6
lens 12.8
black 12.6
arrow 12.2
hand 12.1
metal 12.1
engineer 12
uniform 11.7
war 11.6
device 11.5
tool 11.5
technology 11.1
shoot 10.6
retro 10.6
projectile 10.4
industrial 10
music 9.9
studio 9.9
microphone 9.8
army 9.7
digital 9.7
men 9.4
safety 9.2
entertainment 9.2
vintage 9.1
human 9
worker 8.9
job 8.8
zoom 8.8
wood 8.3
focus 8.3
holding 8.2
television camera 8.1
painter 8.1
camouflage 7.8
shooting 7.8
warrior 7.8
gas 7.7
old 7.7
engineering 7.6
machine 7.3
occupation 7.3
protection 7.3
danger 7.3
clothing 7.2
handsome 7.1
smile 7.1
working 7.1
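
A hedged sketch of an Imagga v2 tagging request; the key/secret pair and image URL are placeholders:

```python
# Hypothetical Imagga tagging request; the v2 API uses HTTP Basic auth.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```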

Microsoft
created on 2022-06-03

person 97.3
man 94.6
text 89.3
camera 86
black and white 82.8
tripod 75.4
clothing 69.2
weapon 52.8
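
A hedged sketch using the Azure Computer Vision SDK (azure-cognitiveservices-vision-computervision); the endpoint, key, and image URL are placeholders, and the SDK reports confidences in the 0-1 range, so they are scaled to percent here:

```python
# Hypothetical tag request against Azure Computer Vision.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder key
)

result = client.tag_image("https://example.com/image.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```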

Face analysis

AWS Rekognition

Age 54-62
Gender Male, 100%
Confused 93%
Surprised 6.6%
Fear 6.1%
Calm 2.8%
Sad 2.3%
Disgusted 1.5%
Angry 0.7%
Happy 0.2%
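
A minimal sketch, assuming boto3, of the Rekognition face-attribute call behind the age range, gender, and emotion estimates above; the image file name is a placeholder:

```python
# Hypothetical face-attribute detection with Amazon Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("siskind_kentucky.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```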

Microsoft Cognitive Services

Age 68
Gender Male
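
A hedged sketch of an Azure Face API request for age and gender estimates, assuming the azure-cognitiveservices-vision-face package; the endpoint, key, and image URL are placeholders:

```python
# Hypothetical age/gender estimate via the Azure Face API.
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

face_client = FaceClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder key
)

faces = face_client.face.detect_with_url(
    url="https://example.com/image.jpg",
    return_face_attributes=["age", "gender"],
)

for face in faces:
    attrs = face.face_attributes
    print("Age", round(attrs.age))
    print("Gender", attrs.gender)  # may come back as an enum, depending on SDK version
```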

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
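
A minimal sketch of the Google Cloud Vision face-detection call that yields likelihood ratings such as "Very unlikely", assuming google-cloud-vision 2.x or later; the image file name is a placeholder:

```python
# Hypothetical face-likelihood query with Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("siskind_kentucky.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    for attr in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        likelihood = vision.Likelihood(getattr(face, f"{attr}_likelihood"))
        print(attr.capitalize(), likelihood.name.replace("_", " ").capitalize())
```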

Feature analysis

Amazon

Person 99.8%
Glasses 79.8%
Bird 68.1%

Categories

Imagga

paintings art 75.6%
people portraits 22.8%
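
A hedged sketch of an Imagga categorization request; the categorizer ID "personal_photos", the credentials, and the image URL are assumptions rather than values from the museum's pipeline:

```python
# Hypothetical Imagga categorization request.
import requests

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer ID
    params={"image_url": "https://example.com/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)
response.raise_for_status()

for category in response.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")
```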

Text analysis

Amazon

I
- I
-
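
A minimal sketch, assuming boto3, of the Rekognition text-detection call behind the lines above; the image file name is a placeholder:

```python
# Hypothetical text detection with Amazon Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("siskind_kentucky.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the individual WORD detections
        print(detection["DetectedText"])
```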