Human Generated Data

Title

[Lyonel Feininger, Deep]

Date

1930-1931

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.42.6

Machine Generated Data

Tags

Amazon
created on 2022-06-03

Human 98.6
Person 98.6
Military 97
Officer 97
Military Uniform 97
Person 89.7
Shoe 88.4
Footwear 88.4
Clothing 88.4
Apparel 88.4
Captain 77.2
Nature 74.2
Shoe 70.3
Outdoors 63.3
Coat 56.9
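
These label/confidence pairs have the shape of output from Amazon Rekognition's DetectLabels API. Below is a minimal sketch of how such tags could be produced with boto3, assuming AWS credentials are already configured; the filename is a hypothetical placeholder, not the museum's actual asset path.

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("brlf_42_6.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # drop labels below 50% confidence
    )

# Prints label/confidence pairs like "Person 98.6" above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```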

Imagga
created on 2022-06-03

fountain 100
structure 71.1
negative 26.8
film 21.3
water 20.7
black 16.3
dirty 15.4
old 15.3
grunge 15.3
river 14.2
city 13.3
texture 13.2
art 13
photographic paper 12.8
travel 12.7
landscape 12.6
park 12.3
pattern 11.6
blackboard 11.5
ashcan 11.4
paint 10.9
building 10.4
antique 10.4
dark 10
outdoor 9.9
vintage 9.9
digital 9.7
outdoors 9.7
mask 9.6
color 9.5
light 9.4
frame 9.2
bin 9.1
environment 9
border 9
container 9
retro 9
design 9
text 8.7
forest 8.7
fog 8.7
scene 8.7
photographic equipment 8.5
splash 8.4
stone 8.4
sky 8.3
wet 8
architecture 7.8
space 7.8
rust 7.7
tree 7.7
damaged 7.6
grungy 7.6
canvas 7.6
power 7.6
graphic 7.3
rough 7.3
industrial 7.3
trees 7.1
night 7.1
summer 7.1
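
Tags in this form can be retrieved from Imagga's v2 tagging endpoint, a REST API authenticated with HTTP Basic credentials. A minimal sketch, assuming valid API credentials; the key, secret, and image URL below are placeholders.

```python
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder
image_url = "https://example.com/brlf_42_6.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry pairs an English tag with a confidence score,
# e.g. "fountain 100" above.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```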

Microsoft
created on 2022-06-03

text 98.1
outdoor 96.9
man 93.9
black and white 93.2
person 88.2
water 68.2
drawing 57.4
old 51.7
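
Tags like these can be obtained from the Azure Computer Vision v3.2 Analyze endpoint. A minimal sketch, assuming a provisioned Computer Vision resource; the endpoint host, key, and filename are placeholders.

```python
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "your_subscription_key"  # placeholder

with open("brlf_42_6.jpg", "rb") as f:  # hypothetical placeholder path
    response = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
response.raise_for_status()

# Confidence is returned on a 0-1 scale; multiplying by 100
# matches the percentages listed above, e.g. "text 98.1".
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```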

Face analysis

Amazon

AWS Rekognition

Age 47-53
Gender Male, 63.7%
Calm 92.1%
Fear 6.4%
Surprised 6.3%
Sad 4.7%
Happy 0.2%
Disgusted 0.2%
Confused 0.2%
Angry 0.2%
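
The age range, gender, and emotion scores above match the FaceDetails returned by Rekognition's DetectFaces API when all facial attributes are requested. A minimal sketch with boto3, assuming configured AWS credentials; the filename is a hypothetical placeholder.

```python
import boto3

client = boto3.client("rekognition")

with open("brlf_42_6.jpg", "rb") as f:  # hypothetical placeholder path
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Sort emotions by confidence, matching the ordering above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```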

Feature analysis

Amazon

Person 98.6%
Shoe 88.4%
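
These per-object percentages correspond to label instances (localized bounding-box detections) in the same DetectLabels response used for the tags above. A minimal sketch, with the same placeholder filename.

```python
import boto3

client = boto3.client("rekognition")
with open("brlf_42_6.jpg", "rb") as f:  # hypothetical placeholder path
    response = client.detect_labels(Image={"Bytes": f.read()})

# Labels localized in the image carry per-instance bounding
# boxes alongside an instance-level confidence.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # relative Left/Top/Width/Height
        print(f'{label["Name"]} {instance["Confidence"]:.1f}%', box)
```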
