Human Generated Data

Title

Untitled (Berkeley)

Date

1980

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5217

Copyright

© Bill Dane

Machine Generated Data

Tags

Each tag below is paired with the generating service's confidence score on a 0-100 scale.

Amazon
created on 2019-11-15

Person 99
Human 99
Building 98.4
Factory 96.7
Machine 88.9
Person 88.7
Wheel 87.1
Furniture 86.3
Chair 86.3
Person 83.2
Clinic 78.8
Workshop 59.6
Assembly Line 57
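
The label/score pairs above match the shape of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how such tags are typically produced with boto3 follows; the file name, region, and thresholds are placeholders, not details taken from this record.

    import boto3

    # Placeholder local copy of the photograph; not part of the museum record.
    IMAGE_PATH = "untitled_berkeley_1980.jpg"

    client = boto3.client("rekognition", region_name="us-east-1")
    with open(IMAGE_PATH, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    # Each label carries a name and a 0-100 confidence score,
    # the same form as the "Person 99", "Building 98.4", ... pairs above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")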

Clarifai
created on 2019-11-15

people 98.8
grinder 97.9
industry 96
indoors 94.5
group together 92.7
adult 92.5
production 92.3
man 91.2
group 88.6
vehicle 87.5
room 86
transportation system 81.3
business 80.7
technology 79.1
conveyer belt 76.3
warehouse 75.7
machine 74.2
many 73.8
commerce 72.8
steel 69.5

Imagga
created on 2019-11-15

industry 37.6
industrial 31.8
modern 31.5
factory 30.9
cyclotron 30.7
interior 30.1
building 29.1
steel 28.7
equipment 25.4
power 25.2
accelerator 24.7
engineering 22.8
plant 20.1
architecture 19.5
pipe 19.4
structure 18.6
scientific instrument 18.5
device 17.8
metal 17.7
inside 17.5
urban 17.5
tube 17.4
heavy 17.2
business 17
pipes 16.7
instrument 15.9
machine 15.8
station 15.7
manufacturing 15.6
technology 15.6
room 15.5
energy 15.1
mechanical 14.6
furniture 14.1
restaurant 13.9
production 13.6
gas 13.5
pollution 13.5
work 13.3
valve 13.2
piping 12.8
pump 12.7
house 12.5
design 12.4
chair 12
machinery 11.7
system 11.4
light 11.4
oil 11.1
dairy 11.1
office 10.7
waste 10.7
reflection 10.6
indoors 10.5
estate 10.4
construction 10.3
people 10
pipeline 9.8
mechanic 9.8
turbine 9.7
steam 9.7
fuel 9.6
dishwasher 9.6
table 9.5
men 9.4
water 9.3
window 9.3
heat 9.3
clean 9.2
transportation 9
science 8.9
refinery 8.9
shop 8.9
tank 8.8
pressure 8.7
complex 8.7
stainless 8.7
technical 8.7
chemical 8.7
hall 8.6
luxury 8.6
electricity 8.5
stock 8.4
city 8.3
environment 8.2
indoor 8.2
new 8.1
home 8
decor 8
fabricate 7.9
enterprise 7.9
engineer 7.8
white goods 7.8
petrol 7.8
radiation 7.8
process 7.8
high 7.8
glass 7.8
district 7.8
3d 7.7
supply 7.7
lamp 7.7
engine 7.7
wall 7.7
contrast 7.7
apartment 7.7
health 7.6
control 7.6
hot 7.5
electric 7.5
floor 7.4
lines 7.2
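
Imagga tags like those above are typically retrieved from its public v2 tagging API. The sketch below is an assumption about how such a list could be generated, not a record of the actual pipeline; the endpoint, response shape, credentials, and image URL shown are placeholders.

    import requests

    # Placeholder credentials; Imagga issues a key/secret pair for HTTP Basic auth.
    IMAGGA_KEY = "your_api_key"
    IMAGGA_SECRET = "your_api_secret"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/untitled_berkeley_1980.jpg"},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    # Each entry pairs an English tag with a 0-100 confidence, as listed above.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")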

Google
created on 2019-11-15

Factory 72.2
Machine 60.8
Building 59.5
Toolroom 53.6
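
These four labels have the form of Google Cloud Vision label detection output, which scores labels on a 0-1 scale (rendered above as percentages). A minimal sketch with the google-cloud-vision client; the file name is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("untitled_berkeley_1980.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        # score is 0-1; the record above shows it as a percentage (e.g. Factory 72.2)
        print(f"{label.description} {label.score * 100:.1f}")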

Microsoft
created on 2019-11-15

text 94
indoor 88.9
black and white 79
person 72.1
ladder 68.9
white 61
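
The Microsoft tags resemble output from the Azure Computer Vision tagging endpoint. The sketch below assumes the tag operation of the azure-cognitiveservices-vision-computervision SDK was used; the endpoint, key, and file name are placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )
    with open("untitled_berkeley_1980.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)

    # Tags come back with a 0-1 confidence, shown above as percentages.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")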

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 46-64
Gender Female, 50.1%
Happy 49.5%
Calm 49.5%
Disgusted 49.5%
Surprised 49.5%
Fear 49.7%
Sad 50.3%
Confused 49.5%
Angry 49.5%

AWS Rekognition

Age 26-42
Gender Male, 50.4%
Happy 49.7%
Angry 49.8%
Confused 49.7%
Calm 49.7%
Disgusted 49.5%
Fear 49.5%
Surprised 49.5%
Sad 49.6%
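
The two face records above have the structure returned by AWS Rekognition's DetectFaces operation: an estimated age range, a gender guess with confidence, and a confidence value for each emotion. A minimal sketch follows; the file name and region are placeholders, not details from this record.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")
    with open("untitled_berkeley_1980.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required for age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        print(f"Age {age['Low']}-{age['High']}, "
              f"{gender['Value']} {gender['Confidence']:.1f}%, "
              f"{top['Type'].title()} {top['Confidence']:.1f}%")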

Feature analysis

Amazon

Person 99%
Wheel 87.1%
Chair 86.3%

Categories

Text analysis

Google

मरत
मरत
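
The detected string comes from Google Cloud Vision's OCR (text detection), which returns the full detected text first and then each individual word, which is consistent with the repeated line above. A minimal sketch; the file name is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("untitled_berkeley_1980.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    for annotation in response.text_annotations:
        print(annotation.description)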