Human Generated Data

Title

Adler Limousine, 1930-1933

Date

c. 1930-1933

People

Artist: Unidentified Artist

Classification

Archival Material

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Walter Gropius, BRGA.34.81

Machine Generated Data

Tags

Amazon
created on 2019-05-31

Human 99.2
Person 99.2
Furniture 94.1
Person 89.4
Apparel 86.6
Clothing 86.6
Vehicle 81.8
Automobile 81.8
Car 81.8
Transportation 81.8
Overcoat 72.6
Coat 72.6
Suit 72.6
Couch 69.9
Bed 58.8
Machine 56.4
Spoke 56.4
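
The Amazon labels above are the kind of output returned by the Rekognition DetectLabels API. As a rough illustration only (this is a sketch, not the museum's actual pipeline), a boto3 call producing a comparable label/confidence list might look like the following; the file name adler_limousine.jpg and the confidence threshold are assumptions.

```python
# Minimal sketch (assumption: not the actual tagging pipeline).
# Runs Amazon Rekognition DetectLabels on a local image and prints
# each label with its confidence score, similar to the list above.
import boto3

def detect_labels(path, min_confidence=55.0):
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,
    )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

# Hypothetical file name, for illustration only.
detect_labels("adler_limousine.jpg")
```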

Clarifai
created on 2019-05-31

people 99.8
adult 98.6
one 98.4
furniture 98
man 96.3
vehicle 96.2
two 94.6
woman 93.6
wear 92.1
transportation system 91.2
chair 91
retro 90.7
lid 90
room 87.6
leader 87.1
seat 86.5
group 86.2
administration 86
family 85.9
monochrome 85.1

Imagga
created on 2019-05-31

furniture 41.1
room 33.2
home appliance 29.4
home 27.1
interior 26.5
house 25.9
appliance 25.2
kitchen appliance 24.3
modern 23.1
table 22.4
equipment 22.3
box 20.7
kitchen 19.8
wood 17.5
toaster 17.4
decor 16.8
floor 16.7
office 16.7
bedroom 16.1
inside 15.6
indoor 15.5
microwave 15.4
computer 14.9
dishwasher 14.6
apartment 14.4
desk 14.3
indoors 14
car 14
printer 13.6
luxury 12.9
cabinet 12.8
white goods 12.8
work 12.5
3d 12.4
monitor 12.3
file 12.2
man 12.1
apparatus 12
window 11.9
hospital 11.9
furnishing 11.8
people 11.7
chair 11.5
lamp 11.4
machine 11.4
design 11.2
photocopier 11.1
business 10.9
duplicator 10.6
estate 10.4
technology 10.4
container 10.2
nobody 10.1
clean 10
vehicle 9.7
new 9.7
wooden 9.7
decoration 9.4
lifestyle 9.4
refrigerator 9.2
durables 9.2
working 8.8
device 8.8
pillow 8.7
laptop 8.6
architecture 8.6
glass 8.6
seat 8.5
contemporary 8.5
professional 8.4
horizontal 8.4
sofa 8.1
transportation 8.1
light 8
crate 8
steel 7.9
male 7.8
render 7.8
stainless 7.7
bed 7.7
old 7.7
hotel 7.6
relax 7.6
rest 7.5
relaxation 7.5
crib 7.5
person 7.3
domestic 7.2
shelf 7.1
oven 7.1
copy 7.1
travel 7

Google
created on 2019-05-31

Microsoft
created on 2019-05-31

vehicle 88.2
old 87.1
land vehicle 82.1
black 72.4
vintage 31

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Angry 45.2%
Happy 45.1%
Sad 45.2%
Disgusted 45.1%
Confused 45.1%
Calm 54.2%
Surprised 45.2%
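
The age, gender, and emotion estimates above resemble the FaceDetails returned by the Rekognition DetectFaces API when all attributes are requested. A hedged sketch under the same assumptions as before (boto3 client, hypothetical local file name):

```python
# Minimal sketch (assumption): request full face attributes from
# Amazon Rekognition and print age range, gender, and emotion scores.
import boto3

def analyze_faces(path):
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')

# Hypothetical file name, for illustration only.
analyze_faces("adler_limousine.jpg")
```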

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Categories

Imagga

interior objects 99.7%

Captions

Microsoft
created on 2019-05-31

a vintage photo of a man 82.9%
a vintage photo of a truck 73.6%
an old photo of a man 73.5%