Human Generated Data

Title

Untitled (Dr. Herman M. Juergens: walking outside; looking through microscope)

Date

1964-67

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.1.189

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Art 96.6
Collage 96.6
Baby 96.4
Person 96.4
Person 96.2
Painting 90
Computer Hardware 85.6
Electronics 85.6
Hardware 85.6
Monitor 85.6
Screen 85.6
Head 76.7
Monitor 76.5
Face 74.9
Person 74.6
Animal 62.2
Mammal 60.1
Silhouette 57.3
Indoors 56.3
Outdoors 55.6
Antelope 55.1
Impala 55.1
Wildlife 55.1

Clarifai
created on 2019-02-18

vehicle 98.1
television 98
people 97.5
analogue 97.5
screen 96.6
no person 95.7
city 93.6
transportation system 93.5
one 93.3
museum 93.2
travel 91.7
movie 90.2
music 90.2
analog 90
group 89.4
technology 88.6
landscape 88.4
adult 88
portrait 87.6
street 87.5

Imagga
created on 2019-02-18

gas pump 35
pump 29.6
car 25.9
mechanical device 21.2
model t 19.2
city 19.1
urban 14.8
business 14.6
mechanism 14.3
travel 14.1
architecture 14.1
night 13.3
boat 13.1
device 13
motor vehicle 12.7
water 12.7
black 12
old 11.8
office 11.7
transportation 11.6
building 11.4
sky 10.2
light 10
vehicle 9.9
stall 9.8
monitor 9.4
electronic equipment 9.3
equipment 9.3
ship 9.3
structure 9.2
transport 9.1
container 8.6
tourism 8.2
computer 8.1
reflection 8.1
working 7.9
work 7.8
people 7.8
center 7.7
sign 7.5
ocean 7.5
tramway 7.4
street 7.4

Google
created on 2019-02-18

Microsoft
created on 2019-02-18

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 68-78
Gender Male, 99.9%
Calm 91%
Surprised 6.8%
Fear 6%
Sad 3.8%
Happy 1.2%
Angry 0.8%
Confused 0.8%
Disgusted 0.7%

Feature analysis

Amazon

Baby 96.4%
Person 96.4%
Monitor 85.6%

Categories

Imagga

text visuals 99.9%

Captions

Microsoft
created on 2019-02-18

a screen shot of a television 33.9%
a close up of a screen 33.8%
an old television 30.2%

Text analysis

Amazon

PAN
29
32
RODAR
VLC
DE
92
RODAR THE F PAN T.ILM
F
YUC
T.ILM
>32A
THE
VOZ
V92
3308
TS

Google

→32 →32 A →31A
32
A
31A