Human Generated Data

Title

[Julia and Lux Feininger by buggies]

Date

1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.558.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 99.8
Person 99.8
Machine 99.7
Wheel 99.7
Wheel 99.1
Person 98.9
Car 98.6
Automobile 98.6
Transportation 98.6
Vehicle 98.6
Wheel 98.5
Car 96.3
Bicycle 86.9
Bike 86.9
Bicycle 83.3
Carriage 83.1
Spoke 81.3
Wheel 77.7
Wagon 77.1
Car 74.7
Wheel 68.5
Wheel 61.2
Car 60.7
Wheel 59.1
Furniture 55.8
Bicycle 52.7
Person 52.3

Clarifai
created on 2019-11-19

people 100
group together 99.2
group 98.9
adult 98.5
vehicle 98.4
two 98.2
transportation system 97.8
many 96.6
man 96.3
cavalry 96
carriage 95.6
wagon 95.5
several 94.3
military 94
three 93.5
one 92
four 91.3
cart 90.1
woman 89.6
war 86.3

Imagga
created on 2019-11-19

carriage 100
horse cart 52.3
cart 48.4
wagon 40.2
horse 23.5
wheeled vehicle 22.1
vehicle 21.9
old 18.1
farm 17.8
transportation 17
rural 16.7
grass 16.6
travel 15.5
outdoor 15.3
landscape 13.4
field 13.4
animal 12.3
building 12
outdoors 11.9
tractor 11.8
sky 11.5
cannon 11.1
man 10.8
horses 10.7
park 10.7
mountain 10.7
trees 10.7
vacation 10.6
architecture 10.2
transport 10.1
house 10
city 10
road 9.9
machinery 9.7
agriculture 9.7
outside 9.4
hill 9.4
male 9.2
street 9.2
sport 9.1
gun 9
people 8.9
fence 8.8
wheels 8.8
country 8.8
work 8.7
antique 8.7
dirt 8.6
machine 8.5
wheel 8.5
tree 8.5
vintage 8.3
historic 8.3
countryside 8.2
history 8.1
sand 8
wild 7.8
roof 7.6
two 7.6
car 7.6
drive 7.6
speed 7.3
weapon 7.3
industrial 7.3
hay 7.2
sunset 7.2

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

building 99
outdoor 95.3
drawn 94.9
horse 93.6
carriage 90.1
wheel 89.4
pulling 87.7
text 83.8
land vehicle 81.6
black and white 74.4
vehicle 73.1
cart 41.5
pulled 32.9

Face analysis

Amazon

AWS Rekognition

Age 19-31
Gender Male, 53.9%
Fear 45%
Angry 45.6%
Calm 53.8%
Surprised 45.3%
Happy 45%
Confused 45%
Disgusted 45%
Sad 45.1%

Feature analysis

Amazon

Person 99.8%
Wheel 99.7%
Car 98.6%
Bicycle 86.9%

Captions

Microsoft

a person riding a horse drawn carriage in front of a building 96%
a person riding a horse drawn carriage 95.9%
a person riding a horse drawn carriage in front of a building 93.9%

Text analysis

Amazon

CREAM
ERRGONGS
ICE CREAM
ICE
ERRG

Google

ERRGONGS
CREAM
ERRG
CE
ERRGONGS ICE CREAM ERRG CE
ICE