Human Generated Data

Title

Untitled (Alameda)

Date

1976

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5106

Copyright

© Bill Dane

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2019-11-15

Person 99.3
Human 99.3
Person 95.7
Sitting 94.3
Person 89.6
Person 87.3
Machine 82.3
Wheel 82.3
Apparel 81.5
Footwear 81.5
Clothing 81.5
Shoe 81.5
Automobile 79.5
Vehicle 79.5
Transportation 79.5
Car 79.5
Person 77.9
Flooring 75.5
Undershirt 74.4
Truck 71.3
Furniture 70.6
Couch 70.6
Cafeteria 69.2
Restaurant 69.2
Person 65
Food 64.6
Meal 64.6
Car 63
People 58.5
Market 57.6
Shoe 54.7
Chair 50.7
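
The Amazon tags above are confidence scores from Amazon Rekognition's label detection. A minimal sketch of how comparable output could be produced with boto3, assuming configured AWS credentials and a hypothetical local file name for the photograph:

import boto3

# Assumes AWS credentials and region are already configured for boto3.
client = boto3.client("rekognition")

# Hypothetical file name for a scan of the photograph.
with open("untitled_alameda_1976.jpg", "rb") as f:
    image_bytes = f.read()

# The tag list above appears to cut off near 50, so use that as the floor.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))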

Clarifai
created on 2019-11-15

people 99.9
group 98.5
adult 98.2
group together 97.6
many 97.4
woman 95.7
vehicle 94.4
man 91
child 89.7
war 89.5
several 89.3
furniture 89.2
administration 88.9
wear 87.3
street 87.2
military 86.4
transportation system 86.1
music 85
chair 84.7
leader 83.7

Imagga
created on 2019-11-15

room 29.1
interior 28.3
chair 27.3
man 26.2
table 26.1
home 23.1
people 22.3
seller 20.8
male 20.6
person 20.1
kitchen 18.9
classroom 16.5
lifestyle 15.9
house 15.9
indoors 15.8
couple 15.7
food 15.3
restaurant 14.8
adult 14.4
modern 14
sitting 13.7
teacher 13.3
boy 13
inside 12.9
happy 12.5
smiling 12.3
cheerful 12.2
floor 12.1
women 11.9
counter 11.4
dinner 11.4
design 11.2
life 11.1
furniture 11
stove 10.9
meal 10.9
drink 10.8
family 10.7
dining 10.5
outdoors 10.4
glass 10.1
mother 10
leisure 10
business 9.7
together 9.6
men 9.4
cafeteria 9.3
lunch 9.1
indoor 9.1
group 8.9
cooking 8.7
class 8.7
day 8.6
student 8.5
oven 8.5
shop 8.5
contemporary 8.5
waiter 8.4
mature 8.4
wood 8.3
wine 8.3
fun 8.2
teenager 8.2
child 8.2
work 8.1
refrigerator 7.9
urban 7.9
smile 7.8
happiness 7.8
chairs 7.8
standing 7.8
architecture 7.8
students 7.8
party 7.7
casual 7.6
fashion 7.5
city 7.5
holding 7.4
technology 7.4
seat 7.4
coffee 7.4
building 7.3
cook 7.3
decoration 7.2
holiday 7.2
handsome 7.1
board 7.1
kid 7.1
decor 7.1

Google
created on 2019-11-15

Motor vehicle 97.8
Snapshot 84.9
Black-and-white 80.7
Vehicle 76.5
Monochrome 75.8
Car 71.7
Photography 67.8
Street 63.4
Style 53.5
Street food 53.5

Microsoft
created on 2019-11-15

clothing 98
person 95.9
black and white 92.2
table 88.1
woman 86.5
text 74.8
man 73.1
furniture 22.5

Face analysis

AWS Rekognition

Age 5-15
Gender Female, 54.4%
Fear 45.1%
Disgusted 45%
Calm 47.7%
Happy 45%
Surprised 45%
Angry 45%
Sad 52.2%
Confused 45%

AWS Rekognition

Age 9-19
Gender Female, 54.1%
Calm 45%
Sad 54.6%
Angry 45%
Disgusted 45%
Happy 45%
Surprised 45%
Fear 45.3%
Confused 45%

AWS Rekognition

Age 23-35
Gender Male, 50.3%
Surprised 49.6%
Sad 49.6%
Fear 49.5%
Angry 49.7%
Happy 49.5%
Disgusted 49.5%
Confused 49.5%
Calm 50%

Microsoft Cognitive Services

Age 13
Gender Female
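
The age, gender, and emotion estimates above follow the shape of Rekognition's face detection response. A sketch of reading those fields with boto3 (same assumed image file as before; field names follow the public DetectFaces response):

import boto3

client = boto3.client("rekognition")
with open("untitled_alameda_1976.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions per face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")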

Feature analysis

Amazon

Person 99.3%
Wheel 82.3%
Shoe 81.5%
Car 79.5%
Truck 71.3%
Chair 50.7%

Text analysis

Google

AL
AL
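
The detected text ("AL", twice) is the kind of result Google Cloud Vision's text detection returns. A sketch using the google-cloud-vision client, assuming the same hypothetical image file and that Google credentials are configured:

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
client = vision.ImageAnnotatorClient()

with open("untitled_alameda_1976.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full text block; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)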