Human Generated Data

Title

Untitled (Mexico)

Date

1976

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5102

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99
Person 99
Person 91.5
Shop 79
Food 73.6
Meal 73.6
Pub 70.2
Person 69.7
Bar Counter 68.5
Indoors 68.2
Fireplace 68.2
Person 65.2
Person 63.3
People 62
Person 61.9
Plant 61.4
Restaurant 60.1
Deli 58.7
Market 57.5
Bazaar 57.5
Cafeteria 56.4
Urban 56.1
Worker 55.8

Clarifai
created on 2019-11-15

people 99.9
group 98.9
many 98.8
adult 98.6
one 97.4
man 96.5
two 96.3
group together 93.5
no person 93.4
wear 90.5
vehicle 89.1
container 85.8
woman 84
leader 83.3
religion 82.8
military 81.9
art 81.4
veil 79.5
home 78.4
war 77.7

Imagga
created on 2019-11-15

architecture 38
city 29.1
travel 25.3
building 25.3
tourism 24.7
structure 19.6
old 18.1
history 17
ancient 16.4
monument 15.9
roof 15.5
balcony 15.2
night 15.1
palace 14.7
culture 14.5
sculpture 13.6
fountain 13.4
landmark 12.6
house 12.3
urban 12.2
famous 12.1
capital 11.4
art 11.1
historic 11
shop 11
sky 10.2
temple 10
tourist 10
religion 9.8
cathedral 9.8
attraction 9.5
decoration 9.5
cityscape 9.5
facade 9.3
place 9.3
church 9.2
square 9
detail 8.8
mercantile establishment 8.7
holiday 8.6
percussion instrument 8.6
statue 8.6
musical instrument 8.5
dome 8.5
winter 8.5
stall 8.4
traditional 8.3
street 8.3
chandelier 8.3
stone 7.7
shoe shop 7.6
destination 7.5
town 7.4
exterior 7.4
vacation 7.4
light 7.3

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

person 87.9
text 85.8
black and white 80
clothing 78.8
man 64.1
food 56.8
several 10.2

Color Analysis

Face analysis

AWS Rekognition

Age 24-38
Gender Male, 54.8%
Sad 45%
Fear 45%
Angry 45.1%
Surprised 45%
Calm 45.2%
Disgusted 45%
Confused 45%
Happy 54.6%

Microsoft Cognitive Services

Age 35
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Fireplace 68.2%

Text analysis

Amazon

as
bad
ht
DawiD

Google

as NA 0awiD
as
NA
0awiD