Human Generated Data
Title
[Lyonel Feininger and Galka Scheyer, Hollywood, California]
Date
1936
People
Artist: Unidentified Artist
Classification
Photographs
Credit Line
Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.142.13
Machine Generated Data
Tags
Amazon
created on 2021-04-04
Person (99.5%)
Human (99.5%)
Person (97.7%)
Ground (81.9%)
Clothing (81.8%)
Apparel (81.8%)
Nature (80%)
Outdoors (78.5%)
Soil (77.3%)
Plant (72.6%)
Tree (61.7%)
Brick (57.5%)
Road (56.4%)
Chair (55.4%)
Furniture (55.4%)
Clarifai
created on 2021-04-04
people (99.7%)
monochrome (97.4%)
street (96%)
man (95.9%)
art (95.3%)
woman (94.6%)
adult (94%)
black and white (92.5%)
child (90.4%)
shadow (89.5%)
two (88.4%)
group (87.4%)
analogue (84.4%)
group together (82.7%)
light (79.3%)
portrait (78.6%)
mono (78.5%)
abandoned (77.3%)
boy (76.1%)
wear (75.9%)
Imagga
created on 2021-04-04
kin (38%)
sunset (25.2%)
silhouette (22.4%)
man (20.8%)
groom (20.8%)
people (20.6%)
person (19.4%)
landscape (14.9%)
sky (14.7%)
outdoor (13.8%)
black (13.6%)
male (13.5%)
sun (12.9%)
outdoors (12.7%)
water (12.7%)
dark (11.7%)
summer (11.6%)
forest (11.3%)
men (11.2%)
world (11.1%)
adult (10.5%)
love (10.3%)
beach (10.1%)
child (10%)
dusk (9.5%)
evening (9.3%)
field (9.2%)
ocean (9.1%)
old (9.1%)
sunlight (8.9%)
couple (8.7%)
light (8.7%)
happiness (8.6%)
walking (8.5%)
free (8.4%)
sunrise (8.4%)
relax (8.4%)
park (8.2%)
freedom (8.2%)
sea (7.8%)
travel (7.7%)
outside (7.7%)
tree (7.7%)
two (7.6%)
leisure (7.5%)
sport (7.4%)
danger (7.3%)
family (7.1%)
night (7.1%)
sax (7.1%)
Google
created on 2021-04-04
Plant (91.1%)
Black-and-white (86.4%)
Wood (84.1%)
Style (84%)
Adaptation (79.4%)
Art (78.4%)
Terrestrial plant (78.4%)
Grass (77.8%)
Tints and shades (77.3%)
Monochrome photography (77%)
Trunk (76.6%)
Monochrome (76.2%)
Font (69.6%)
Room (69.2%)
Tree (67.1%)
Chair (66.9%)
Visual arts (65.4%)
Darkness (63.1%)
Arecales (62.9%)
Stock photography (62.9%)
Microsoft
created on 2021-04-04
black and white (96.4%)
monochrome (88.2%)
clothing (87.4%)
person (79.4%)
text (74.7%)
man (72.9%)
grave (59.5%)
fireplace (47.9%)
Face analysis
Amazon
AWS Rekognition
Age: 25-39
Gender: Male, 57.1%
Happy: 49.3%
Calm: 24%
Fear: 12.3%
Sad: 5.2%
Angry: 5.1%
Disgusted: 1.6%
Surprised: 1.3%
Confused: 1.2%
Feature analysis
Amazon
Person (99.5%)
Chair (55.4%)
Categories
Imagga
streetview architecture (47.6%)
interior objects (14.4%)
paintings art (12.2%)
text visuals (8.7%)
events parties (7.8%)
pets animals (3.8%)
nature landscape (2.3%)
people portraits (1.6%)
Captions
Microsoft
created on 2021-04-04
a person standing next to a fireplace (70.9%)
a person standing in front of a fireplace (69.4%)
a group of people standing next to a fireplace (59.3%)