Human Generated Data

Title

Untitled (Accra, Ghana)

Date

1975-1978

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5090

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99.5
Person 99.5
Person 99.3
Person 98.6
Urban 95.9
Building 94.3
Apparel 90.9
Clothing 90.9
Nature 85.2
Person 84.8
Outdoors 79.5
Person 76.9
Rural 75.6
Countryside 75.6
Shelter 75.6
Person 75
Face 65.5
Accessory 62.9
Accessories 62.9
Tie 62.9
Neighborhood 62.9
Standing 60.2
City 59.2
Town 59.2
Sleeve 56.5
Person 55.9
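
The Amazon tags above are the kind of label-and-confidence pairs returned by AWS Rekognition's DetectLabels API. A minimal sketch of such a call, assuming boto3 and a hypothetical S3 location for the image:

```python
# Minimal sketch: label detection with AWS Rekognition via boto3.
# The bucket name and object key below are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "my-images", "Name": "accra-ghana.jpg"}},  # hypothetical
    MinConfidence=50,  # drop labels the service scores below 50%
)

# Each label carries a name and a confidence score, matching the
# "Human 99.5", "Person 99.5", ... pairs listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```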

Clarifai
created on 2019-11-15

people 99.8
adult 98.6
man 98.1
group 96.8
woman 96.1
child 95.8
two 94.7
one 93.5
room 93
portrait 92.2
family 91.6
street 91.2
wear 88.5
boy 86.3
window 84
house 83.8
home 82.5
three 81.6
group together 81.2
couple 81.1

Imagga
created on 2019-11-15

barbershop 77.2
shop 61.2
mercantile establishment 47.7
place of business 31.8
old 25.8
architecture 19.5
building 18.1
hairdresser 16.4
wall 16.2
establishment 15.9
ancient 15.6
city 15
house 14.2
vintage 14.1
antique 13
window 12.7
stone 12.6
door 12.6
people 12.3
man 11.4
travel 11.3
grunge 11.1
street 11
brick 10.4
black 10.2
historic 10.1
male 10
tourism 9.9
religion 9.9
detail 9.6
urban 9.6
historical 9.4
person 9.1
decoration 9.1
dirty 9
history 8.9
statue 8.6
culture 8.5
art 8.5
monument 8.4
aged 8.1
home 8
buildings 7.6
human 7.5
church 7.4
structure 7.3
landmark 7.2
adult 7.2
portrait 7.1

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

clothing 97.9
person 97.2
gallery 97.1
text 96
man 93.6
house 84.3
room 83.7
billboard 79.5
scene 71.6
black and white 70.9
smile 70.8
human face 62.2
old 47.5
picture frame 21.5

Color Analysis

Face analysis

AWS Rekognition

Age 33-49
Gender Male, 54.9%
Happy 50.3%
Angry 45.1%
Confused 45.1%
Calm 49.2%
Disgusted 45.1%
Fear 45%
Surprised 45.2%
Sad 45.1%

AWS Rekognition

Age 24-38
Gender Male, 54.5%
Sad 46.1%
Calm 50.6%
Angry 47.4%
Fear 45.2%
Happy 45%
Confused 45.1%
Surprised 45.3%
Disgusted 45.1%

AWS Rekognition

Age 16-28
Gender Female, 50.4%
Sad 49.5%
Disgusted 49.6%
Confused 49.5%
Surprised 49.5%
Angry 49.6%
Calm 49.6%
Happy 50.2%
Fear 49.5%

AWS Rekognition

Age 23-37
Gender Male, 50%
Fear 49.5%
Confused 49.5%
Calm 49.7%
Sad 49.6%
Disgusted 49.7%
Happy 49.6%
Surprised 49.5%
Angry 49.9%
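
The AWS Rekognition face blocks above (age range, gender, and per-emotion confidences) correspond to the DetectFaces API with all facial attributes requested. A minimal sketch, again assuming boto3 and a hypothetical S3 location:

```python
# Minimal sketch: face attribute estimation with AWS Rekognition.
# Bucket and key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-images", "Name": "accra-ghana.jpg"}},  # hypothetical
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {'Low': 33, 'High': 49}
    gender = face["Gender"]       # e.g. {'Value': 'Male', 'Confidence': 54.9}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # each entry has a Type and a Confidence
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```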

Microsoft Cognitive Services

Age 51
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Tie 62.9%

Captions

Microsoft
created on 2019-11-15

a man standing in a room 91%
an old photo of a man in a room 90.9%
an old photo of a man 85.6%
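
Captions of this form, each paired with a confidence, are what the Azure Computer Vision describe operation returns. A minimal sketch over REST, with a hypothetical endpoint, key, and image URL:

```python
# Minimal sketch: image captioning with the Azure Computer Vision
# "describe" endpoint. Endpoint, key, and image URL are hypothetical.
import requests

endpoint = "https://my-resource.cognitiveservices.azure.com"  # hypothetical
key = "MY_SUBSCRIPTION_KEY"                                   # hypothetical

response = requests.post(
    f"{endpoint}/vision/v2.0/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.org/accra-ghana.jpg"},      # hypothetical image URL
)
response.raise_for_status()

# Each caption has a 0-1 confidence; scaled to match "a man standing in a room 91%".
for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```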

Text analysis

Amazon

good
a
ClubBeer
Ite a good lilfe
lilfe
BED
Ite
OY
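
Raw detections like the Amazon lines above are the kind of output AWS Rekognition's DetectText returns; it reports both whole lines and individual words, which is why phrases ("Ite a good lilfe") and their fragments ("lilfe") both appear. A minimal sketch with a hypothetical S3 location:

```python
# Minimal sketch: text-in-image detection with AWS Rekognition.
# Bucket and key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "my-images", "Name": "accra-ghana.jpg"}},  # hypothetical
)

# TextDetections mixes LINE and WORD entries; each has the raw detected string.
for detection in response["TextDetections"]:
    print(detection["DetectedText"])
```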

Google

Beer
1 good e W Club Beer
1
good
e
W
Club