Human Generated Data
Title
Untitled (men preparing fishing nets, Nazaré, Portugal)
Date
1967
People
Artist: Gordon W. Gahan, American 1945 - 1984
Classification
Photographs
Credit Line
Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.546.3
Machine Generated Data
Tags
Amazon
created on 2019-08-09
Person: 99.5%
Human: 99.5%
Person: 99.3%
Person: 98.8%
Person: 98.2%
Person: 96.4%
Apparel: 80.8%
Clothing: 80.8%
Helmet: 74.9%
People: 67.6%
Crowd: 64.9%
Transportation: 63%
Vehicle: 63%
Mosquito Net: 60.2%
Food: 59.7%
Meal: 59.7%
Car: 58.2%
Automobile: 58.2%
Leisure Activities: 57.7%
Face: 56.4%
Text: 56%
Spoke: 55.8%
Machine: 55.8%
Clarifai
created on 2019-08-09
people: 100%
group: 98.8%
adult: 96.9%
child: 96.9%
group together: 96.1%
two: 93%
music: 92.5%
man: 92.1%
actor: 91.5%
administration: 89.9%
wear: 89.1%
woman: 88.6%
furniture: 87.3%
military: 86.5%
vehicle: 86.1%
several: 84.3%
many: 83.9%
one: 83.8%
musician: 83.8%
three: 82.8%
Imagga
created on 2019-08-09
people: 16.2%
cleaning implement: 15.9%
broom: 14.2%
water: 14%
child: 13.8%
man: 13.4%
fountain: 13.3%
old: 13.2%
happy: 13.2%
structure: 12.3%
black: 12%
happiness: 11.7%
travel: 11.3%
groom: 11.2%
dress: 10.8%
smile: 10.7%
cleaner: 10.6%
wall: 10.3%
smiling: 10.1%
loom: 10%
outdoor: 9.9%
umbrella: 9.2%
adult: 9.1%
device: 9%
outdoors: 9%
activity: 9%
building: 8.8%
couple: 8.7%
machine: 8.7%
love: 8.7%
lifestyle: 8.7%
sitting: 8.6%
joy: 8.3%
dark: 8.3%
leisure: 8.3%
human: 8.2%
cheerful: 8.1%
world: 8.1%
wet: 8%
light: 8%
art: 7.9%
piano: 7.9%
holiday: 7.9%
urban: 7.9%
musical instrument: 7.8%
portrait: 7.8%
window: 7.5%
fun: 7.5%
traditional: 7.5%
canopy: 7.5%
work: 7.5%
alone: 7.3%
dirty: 7.2%
person: 7.2%
women: 7.1%
face: 7.1%
Google
created on 2019-08-09
Photograph: 96.7%
Snapshot: 86.4%
Stock photography: 79%
Photography: 72.5%
Painting: 68.7%
Art: 62.5%
Black-and-white: 56.4%
Vehicle: 51.9%
Microsoft
created on 2019-08-09
text: 97%
person: 96%
clothing: 91.6%
black and white: 82.2%
Color Analysis
Face analysis
Amazon
AWS Rekognition
Age: 13-23
Gender: Male, 53.6%
Confused: 45.1%
Angry: 46.7%
Sad: 45.5%
Surprised: 45.5%
Happy: 46.1%
Disgusted: 45.3%
Calm: 50.5%
Fear: 45.3%
Feature analysis
Amazon
Person: 99.5%
Helmet: 74.9%
Categories
Imagga
paintings art: 68.1%
pets animals: 10.5%
people portraits: 10.5%
interior objects: 9%
Captions
Microsoft
created on 2019-08-09
a group of people standing in front of a window: 83.1%
a group of people sitting in front of a window: 74.2%
a group of people posing for a photo in front of a window: 74.1%
Text analysis
Amazon
3
88 3
88