Human Generated Data

Title

[Andreas and Tomas Feininger]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.437.77

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Values are model confidence scores, in percent, as reported by each service.

Amazon
created on 2021-12-13

Person 98.8
Human 98.8
Clinic 92.7
Indoors 78
Room 78
Hospital 63.7
Doctor 59.3
Nurse 58.8
Interior Design 56.8
Operating Theatre 56.8
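
These labels are the kind of output returned by Amazon Rekognition's label-detection API. A minimal sketch of how such tags are typically produced (the file path is hypothetical; this illustrates the API shape, not the museum's actual pipeline):

```python
# Label detection with AWS Rekognition via boto3.
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical local file.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,        # cap the number of labels, as in the list above
    MinConfidence=50.0,  # drop low-confidence labels
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```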

Clarifai
created on 2023-10-15

people 99.9
one 99.5
adult 98.5
two 95.3
man 95.2
administration 94.9
room 94.6
portrait 93.2
child 93
sit 90.5
furniture 87.9
wear 87.3
war 85.2
leader 84.7
education 83.4
retro 83.3
group 82.7
medical practitioner 81.2
indoors 80.9
three 80.9
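
Clarifai serves its general recognition model over a REST endpoint; a sketch of one way to obtain concept tags like those above (the API key is a placeholder, and the exact model ID is an assumption that may vary by account and version):

```python
# Concept tagging with the Clarifai v2 REST API.
# "YOUR_API_KEY" is a placeholder; the general model ID is assumed.
import base64
import requests

with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("ascii")

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

# Clarifai reports concept values on a 0-1 scale; scale to percent for display.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```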

Imagga
created on 2021-12-13

negative 71.6
film 58.8
photographic paper 40.3
grunge 31.5
old 31.3
blackboard 28.8
vintage 27.3
photographic equipment 26.8
texture 26.4
aged 24.4
antique 23.4
grungy 21.8
wall 20.5
ancient 19.9
retro 19.7
frame 16.6
space 16.3
paper 16
rough 15.5
text 14.8
art 14.4
border 13.6
blank 12.9
dirty 12.6
material 12.5
damaged 12.4
design 12.4
textured 12.3
black 12
decorative 11.7
television 11.5
worn 11.5
pattern 10.9
parchment 10.5
weathered 10.4
empty 10.3
paint 10
classroom 9.6
person 9.5
canvas 9.5
people 9.5
machine 9.2
cash machine 9.2
gray 9
brown 8.8
telecommunication system 8.7
screen 8.6
camera 8.3
digital 8.1
computer 8
burned 7.8
scratched 7.8
manuscript 7.8
frames 7.8
burnt 7.8
edge 7.7
stained 7.7
dirt 7.6
device 7.6
sign 7.5
decoration 7.5
water 7.3
historic 7.3
graphic 7.3
business 7.3
color 7.2
portrait 7.1
room 7.1
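
Imagga's tagging endpoint returns a similar ranked list; a sketch using its /v2/tags endpoint with HTTP basic auth (key, secret, and the hosted-image URL are placeholders):

```python
# Image tagging with the Imagga REST API.
# API key/secret are placeholders; the image URL is hypothetical.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    params={"image_url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```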

Google
created on 2021-12-13

Microsoft
created on 2021-12-13

text 95.9
clothing 88.4
person 87.1
human face 82.4
black and white 69.1
white 62
house 60.6
old 59.2
man 51.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Female, 52.1%
Fear 51.1%
Sad 18.6%
Surprised 13.8%
Calm 6.5%
Happy 6.5%
Angry 1.9%
Confused 1%
Disgusted 0.6%
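
The age range, gender estimate, and emotion scores above come from Rekognition's face-detection API; a sketch of how such values are typically read out (again with a hypothetical local file):

```python
# Face analysis with AWS Rekognition via boto3.
# Attributes=["ALL"] requests age range, gender, and emotion estimates.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types are reported in uppercase (e.g., "CALM").
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```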

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft
created on 2021-12-13

an old photo of a person 83.9%
old photo of a person 82%
an old photo of a person 80.6%
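
The Microsoft tags and candidate captions above match the output shape of Azure Computer Vision's describe endpoint; a sketch against the v3.2 REST API (endpoint hostname and subscription key are placeholders):

```python
# Image description with the Azure Computer Vision REST API (v3.2).
# Endpoint hostname and subscription key are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    headers={
        "Ocp-Apim-Subscription-Key": "YOUR_KEY",
        "Content-Type": "application/octet-stream",
    },
    params={"maxCandidates": 3},  # ask for several candidate captions
    data=image_bytes,
)
resp.raise_for_status()

# Confidence is reported on a 0-1 scale; scale to percent for display.
description = resp.json()["description"]
for caption in description["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
print("tags:", ", ".join(description["tags"]))
```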