Human Generated Data

Title

Adler Limousine, 1930-1933

Date

c. 1930-1933

People

Artist: Unidentified Artist

Classification

Archival Material

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Walter Gropius, BRGA.34.55

Machine Generated Data

Tags

Amazon
created on 2019-05-31

Transportation 99.4
Vehicle 99.4
Automobile 99.4
Antique Car 99.2
Hot Rod 96.4
Car 96.3
Model T 90.6
Tire 84.6
Person 82.5
Human 82.5
Spoke 82.4
Machine 82.4
Person 76.3
Person 74.2
Wheel 70.2
Light 66
Sports Car 64.3
Coupe 64.3
Person 64.2
Headlight 59.1
Person 59.1
Car Wheel 57.9

Clarifai
created on 2019-05-31

vehicle 99.2
classic 98.6
retro 98.6
transportation system 98.2
car 98.1
vintage 98
nostalgia 97.8
old 97.2
monochrome 96.1
machine 95.5
antique 93.3
no person 92.5
exhibition 92
wheel 90.8
chrome 90.5
luxury 89.3
people 88.8
two 87.2
analogue 86.6
industry 86.4

Imagga
created on 2019-05-31

washer 100
home appliance 98.8
white goods 80.9
appliance 77
toaster 41.6
durables 35
kitchen appliance 34.2
machine 31.1
equipment 29.1
technology 27.4
device 21.1
old 16
car 15.8
sewing machine 15.5
camera 14.8
object 13.9
seat 13.2
retro 13.1
washing 12.6
lens 12.6
textile machine 12.4
3d 12.4
antique 12.2
vehicle 12.2
metal 12.1
classic 12.1
vintage 11.6
obsolete 11.5
auto 11.5
wheel 11.3
home 11.2
clean 10.8
laundry 10.8
wash 10.6
modern 10.5
computer 10.4
electronic 10.3
transportation 9.9
housework 9.8
support 9.8
digital 9.7
business 9.7
photograph 9.6
automobile 9.6
domestic 9
dirty 9
black 9
indoors 8.8
film 8.6
clothes 8.4
focus 8.3
printer 8.3
inside 8.3
room 8.2
clothing 8
interior 8
industry 7.7
studio 7.6
drive 7.6
plastic 7.3
office 7.2
work 7.1

Google
created on 2019-05-31

Microsoft
created on 2019-05-31

vehicle 85.8
land vehicle 85.7
auto part 85.4
car 78.5
wheel 69.6
black and white 56.8

Face analysis

Amazon

AWS Rekognition

Age 17-27
Gender Female, 50.9%
Disgusted 45.8%
Confused 45.2%
Sad 49.9%
Happy 45.3%
Angry 47.1%
Surprised 45.3%
Calm 46.4%

Feature analysis

Amazon

Car 96.3%
Person 82.5%

Captions

Microsoft

a white car in front of a store 54%
a car parked in front of a store 43.2%
a store inside of a car 43.1%

Text analysis

Amazon

18
16
ANAIA