Human Generated Data

Title

[Unidentified person driving]

Date

1940-1945

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.437.18

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-12-13

Face 92
Human 92
Person 85.5
Clothing 83.6
Apparel 83.6
Skin 82.9
Airplane 82.5
Transportation 82.5
Vehicle 82.5
Aircraft 82.5
Label 72.6
Text 72.6
Furniture 70.2
Neck 66.7
Poster 66.4
Advertisement 66.4
Head 65.2
Collage 64.6
Car 63.3
Automobile 63.3
Word 59.7

Clarifai
created on 2023-10-15

vehicle 99.3
one 98.9
people 98.9
watercraft 97.6
transportation system 97.2
adult 96.5
no person 96
aircraft 95.4
two 92.7
man 89.6
military 89.6
airplane 88.2
war 87.1
administration 83.5
group 78.5
home 78.4
portrait 75.3
vintage 75.1
retro 75
ship 74

Imagga
created on 2021-12-13

missile 49.9
weapon 47.2
rocket 40.7
instrument 29.8
architecture 29.4
device 27.9
conveyance 23.5
old 20.2
building 20.2
negative 19.2
film 16.1
city 15.8
urban 15.7
industrial 15.4
cannon 14.3
light 14.2
travel 13.4
stone 12.7
ancient 12.1
man 12.1
industry 11.9
photographic paper 11.7
sky 11.5
structure 11.4
wall 11.4
construction 11.1
tourism 10.7
gun 10.4
history 9.8
metal 9.7
factory 9.6
grunge 9.4
house 9.2
vintage 9.1
dirty 9
black 9
steel 8.8
brick 8.7
water 8.7
art 8.5
wood 8.3
traditional 8.3
window 8.2
aged 8.1
vehicle 7.9
design 7.9
work 7.8
photographic equipment 7.8
step 7.7
culture 7.7
concrete 7.7
door 7.6
dark 7.5
monument 7.5
smoke 7.4
town 7.4
street 7.4
car 7.3
detail 7.2
transportation 7.2
modern 7

Microsoft
created on 2021-12-13

black and white 87
text 82.9
vehicle 53.3
person 50.1

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 87.1%
Calm 98.2%
Happy 0.7%
Sad 0.4%
Confused 0.2%
Surprised 0.2%
Angry 0.2%
Disgusted 0.1%
Fear 0%

Feature analysis

Amazon

Person 85.5%
Airplane 82.5%
Car 63.3%