Human Generated Data

Title

Untitled (street scenes, Nazaré, Portugal)

Date

1967

People

Artist: Gordon W. Gahan, American (1945-1984)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.563

Machine Generated Data

Tags (label with confidence score, 0-100)

Amazon
created on 2019-08-10

Clothing 90.7
Apparel 90.7
Face 87.7
Text 77.5
Advertisement 76
Collage 70.8
Female 69.2
Symbol 66.3
Number 66.3
Word 65
Pet 64.7
Animal 64.7
Cat 64.7
Mammal 64.7
Dog 60.8
Canine 60.8
Cat 60.5
Page 60.3
Poster 53.7
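
The Amazon scores above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation (the service named under Face analysis below). A minimal sketch of how such tags could be produced, assuming boto3 with configured AWS credentials; the image filename and thresholds are placeholders:

    import boto3

    # Assumed setup: AWS credentials configured and a local copy of the image;
    # the filename below is a placeholder, not the museum's actual file.
    rekognition = boto3.client("rekognition")

    with open("nazare_street_scenes.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )

    # Each label carries a name and a 0-100 confidence score, the same
    # shape as the "Clothing 90.7", "Apparel 90.7", ... pairs above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')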

Clarifai
created on 2019-08-10

people 98.6
monochrome 96.9
adult 94.7
one 94.6
man 94.4
vertical 91.6
woman 91.2
wear 91.1
indoors 87.5
retro 86.6
no person 82.2
paper 81.7
window 80.5
desktop 79.7
dirty 79.5
portrait 77.4
business 77.3
child 76.5
art 76.1
music 75.9
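
The Clarifai concepts above could come from Clarifai's v2 predict endpoint for its general image-recognition model. A minimal sketch, assuming the REST API via requests; the API key, model ID, and image URL are placeholders:

    import requests

    # Assumed setup: a Clarifai API key and the v2 predict endpoint;
    # key, model ID, and image URL below are placeholders.
    API_KEY = "YOUR_CLARIFAI_API_KEY"
    MODEL_ID = "general-image-recognition"

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
    )
    resp.raise_for_status()

    # Concepts come back with a name and a value in 0-1; scaling to 0-100
    # matches the "people 98.6", "monochrome 96.9", ... list above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')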

Imagga
created on 2019-08-10

computer 28.3
business 27.9
office 27.3
film 25.2
work 25.1
technology 23.7
laptop 22.9
screen 21.9
negative 21.7
background 21.5
working 19.4
keyboard 19.4
monitor 19.4
newspaper 18.6
photographic paper 18
display 17.7
device 16.4
people 16.2
hand 15.9
person 15.8
locker 15
corporate 14.6
professional 14.3
adult 14.2
paper 14.1
communication 13.4
job 13.3
indoors 13.2
smile 12.8
product 12.6
notebook 12.5
equipment 12.4
fastener 12.2
photographic equipment 12
data 11.9
male 11.3
desk 11.3
attractive 11.2
money 11.1
portrait 11
happy 10.7
modern 10.5
pretty 10.5
looking 10.4
home 10.4
sitting 10.3
creation 10.2
finance 10.1
occupation 10.1
man 10.1
information 9.7
smiling 9.4
lifestyle 9.4
face 9.2
note 9.2
restraint 9.1
electronic device 9.1
human 9
education 8.7
table 8.7
cute 8.6
wireless 8.6
close 8.6
manager 8.4
closeup 8.1
success 8
financial 8
worker 8
businessman 7.9
x-ray film 7.9
black 7.8
typing 7.8
model 7.8
writing 7.5
house 7.5
network 7.5
key 7.5
one 7.5
document 7.4
executive 7.4
successful 7.3
student 7.2
women 7.1
interior 7.1
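
The Imagga list above matches the tag/confidence output of Imagga's /v2/tags endpoint. A minimal sketch, assuming API credentials and a hosted copy of the image (both placeholders):

    import requests

    # Assumed setup: Imagga API credentials and a publicly reachable
    # image URL; all values below are placeholders.
    API_KEY = "YOUR_IMAGGA_API_KEY"
    API_SECRET = "YOUR_IMAGGA_API_SECRET"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/image.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a 0-100 confidence score,
    # the same shape as the "computer 28.3", "business 27.9", ... list above.
    for entry in resp.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')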

Google
created on 2019-08-10

Microsoft
created on 2019-08-10

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 36-52
Gender Male, 53.6%
Calm 45.4%
Angry 48.9%
Happy 45.6%
Sad 45.9%
Disgusted 45%
Fear 47.6%
Confused 45.4%
Surprised 46.3%
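
The age range, gender, and emotion scores follow the shape of AWS Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch, assuming boto3 and a placeholder image file:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("nazare_street_scenes.jpg", "rb") as f:  # placeholder filename
        image_bytes = f.read()

    faces = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # needed to get age range, gender, and emotions
    )

    for face in faces["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        gender = face["Gender"]
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions are reported as type/confidence pairs,
        # e.g. Calm, Angry, Happy, Sad as listed above.
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')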

Feature analysis

Amazon

Cat 64.7%
Dog 60.8%
Poster 53.7%

Categories

Imagga

paintings art 86.6%
pets animals 11.7%
food drinks 1.1%

Text analysis

Amazon

PLUS
KODAK
FILM
PAN
PLUS KODAK FILM X PAN
>4A
->5A
>3A
>2A ->4 >3A >4A ->5A
>2A
X
->4
n
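
The Amazon text detections above have the shape of AWS Rekognition's DetectText output, which reports both whole lines and individual words. A minimal sketch, assuming boto3 and a placeholder image file:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("nazare_street_scenes.jpg", "rb") as f:  # placeholder filename
        image_bytes = f.read()

    result = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Rekognition returns both LINE and WORD detections, which is why the list
    # above mixes a full edge marking ("PLUS KODAK FILM X PAN") with fragments.
    for detection in result["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))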

Google

KODAK PLUS X PAN FILM 2 A 3A 4A 5 >5A
KODAK
PLUS
X
PAN
FILM
2
A
3A
4A
5
>5A
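
The Google results follow the usual pattern of the Cloud Vision API's text detection, where the first annotation is the full recognized string and the remaining annotations are individual tokens. A minimal sketch, assuming the google-cloud-vision client library (v2+) and a placeholder image file:

    from google.cloud import vision

    # Assumed setup: Google Cloud credentials configured for the Vision API;
    # the filename is a placeholder.
    client = vision.ImageAnnotatorClient()

    with open("nazare_street_scenes.jpg", "rb") as f:
        content = f.read()

    response = client.text_detection(image=vision.Image(content=content))

    # The first annotation is the full detected string
    # ("KODAK PLUS X PAN FILM 2 A 3A 4A 5 >5A"); the rest are individual tokens.
    for annotation in response.text_annotations:
        print(annotation.description)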