Human Generated Data

Title

[Street scene, San Francisco, California]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1002.51

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-12-13

Person 96.4
Human 96.4
Car 85.5
Automobile 85.5
Vehicle 85.5
Transportation 85.5
Machine 79.8
Spoke 63.1
Helmet 61.4
Clothing 61.4
Apparel 61.4
Outdoors 61.3
Alloy Wheel 60.4
Wheel 60.4
Nature 58.7
Icing 56.6
Food 56.6
Dessert 56.6
Cake 56.6
Cream 56.6
Creme 56.6

Clarifai
created on 2023-10-15

monochrome 99.6
no person 97.2
people 96
street 95.3
snow 91.8
transportation system 89.1
retro 88.9
old 87.9
car 87.1
black and white 87.1
vehicle 86.8
square 86.7
one 86.2
dirty 85.6
winter 83.2
monochromatic 81.9
adult 81.8
vintage 80.1
vehicle window 76.9
nostalgia 74.8

Imagga
created on 2021-12-13

home appliance 29.6
sewing machine 26
device 25.8
appliance 22.9
machine 22
textile machine 20.8
old 18.8
grunge 16.2
wall 15.4
water 14.7
durables 14.4
snow 14.4
car 14.3
texture 13.9
cold 13.8
winter 13.6
dirty 13.6
vintage 13.2
iron lung 13.2
travel 12.7
transportation 12.5
industry 11.9
road 11.7
vehicle 11.5
industrial 10.9
frame 10.8
wreckage 10.8
retro 10.6
respirator 10.5
antique 10.4
ice 10.1
city 10
aged 9.9
iron 9.8
container 9.8
metal 9.7
urban 9.6
truck 9.4
architecture 9.4
space 9.3
street 9.2
equipment 8.9
color 8.9
pattern 8.9
mailbox 8.8
building 8.8
damaged 8.6
construction 8.6
grungy 8.5
structure 8.4
house 8.4
box 8.2
breathing device 8.1
detail 8
material 8
art 7.8
ancient 7.8
glass 7.8
broken 7.7
stone 7.7
frost 7.7
concrete 7.7
clean 7.5
light 7.3
part 7.3
border 7.2
black 7.2
cool 7.1
work 7.1

Microsoft
created on 2021-12-13

black and white 89.4
text 83.9
appliance 79.9
old 60.1
white goods 55.3

Face analysis

Amazon

AWS Rekognition

Age 50-68
Gender Female, 55.5%
Calm 79%
Happy 12.6%
Sad 6.6%
Confused 0.8%
Surprised 0.5%
Angry 0.3%
Fear 0.2%
Disgusted 0.1%

Feature analysis

Amazon

Person 96.4%
Car 85.5%
Helmet 61.4%

Categories

Imagga

paintings art 99.2%

Captions

Microsoft
created on 2021-12-13

an old photo of a person 29.8%
old photo of a person 28.2%

Text analysis

Amazon

4J2642

Google

JAJ2642