Human Generated Data

Title

Chicago

Date

1960, printed later

People

Artist: Harry Callahan, American, 1912–1999

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara and Gene Polk, 2006.306

Copyright

© Estate of Harry Callahan, Courtesy of Pace Gallery

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Face 99.9
Head 99.9
Photography 99.9
Portrait 99.9
Person 98.1
Adult 98.1
Male 98.1
Man 98.1
Person 97.9
Adult 97.9
Female 97.9
Woman 97.9
People 97.3
Body Part 92.3
Finger 92.3
Hand 92.3
Indoors 64.2
Art 57.4
Painting 57.4
Silhouette 56.4
Neck 55.9
Sitting 55.5
Blackboard 55.4
Outdoors 55

Clarifai
created on 2018-08-23

people 99.2
man 96.8
one 96.4
adult 95.2
business 94.6
empty 93.9
display 92.8
blank 91
room 90.9
indoors 89.7
bill 88.5
portrait 88.2
billboard 87.2
two 85.5
side view 84.6
wall 83.9
child 82.8
shadow 82
monochrome 80.7
boy 80.5

Imagga
created on 2018-08-23

binder 85.2
protective covering 68.2
covering 50.9
box 25.9
container 21.2
black 19.9
chest 19.7
bag 19.3
business 18.8
metal 16.9
old 16
office 15.2
computer 14.4
briefcase 13.9
laptop 13.8
equipment 13.6
object 11.7
wooden 11.4
man 10.7
vintage 10.7
silver 10.6
suitcase 9.8
businessman 9.7
education 9.5
leather 9.5
case 9.4
male 9.2
notebook 9.2
retro 9
technology 8.9
brown 8.8
antique 8.6
device 8.6
close 8.5
professional 8.4
adult 8.4
wood 8.3
security 8.3
gold 8.2
style 8.1
open 8.1
interior 7.9
people 7.8
blank 7.7
storage 7.6
clothing 7.6
desk 7.5
symbol 7.4
single 7.4
decoration 7.3
lifestyle 7.2

Google
created on 2018-08-23

Microsoft
created on 2018-08-23

wall 99.1
indoor 89
screen 73.7
electronics 67.4
gallery 62.9
display 33.3
picture frame 15.8

Color Analysis

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 98.1%
Calm 88.8%
Surprised 6.3%
Sad 6%
Fear 6%
Confused 2.5%
Angry 0.5%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Male, 98.7%
Calm 95%
Surprised 6.3%
Fear 5.9%
Sad 3.9%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.1%
Adult 98.1%
Male 98.1%
Man 98.1%
Female 97.9%
Woman 97.9%