Human Generated Data

Title

Untitled (women and children in street in front of building, Nazaré, Portugal)

Date

1967

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.554.2

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Person 99.5
Human 99.5
Pedestrian 95.7
Building 95.2
Person 91.2
Architecture 91.2
Shorts 85.6
Clothing 85.6
Apparel 85.6
Bicycle 83.4
Transportation 83.4
Bike 83.4
Vehicle 83.4
Temple 81.5
Person 76.2
Crowd 70.2
Worship 68.1
Shrine 68.1
People 67.8
Tarmac 65.8
Asphalt 65.8
Sport 63.5
Cricket 63.5
Sports 63.5
Path 58.3
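
These Amazon tags are flat label/confidence pairs (scores in percent) of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch of reproducing such a list with boto3; the S3 bucket and object key are placeholders, not the museum's actual storage:

```python
# Sketch: list Rekognition labels for an image, assuming boto3 and
# configured AWS credentials; bucket/key names are placeholders.
import boto3

rekognition = boto3.client("rekognition")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MinConfidence=50,  # only return labels scored at or above 50%
)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```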

Clarifai
created on 2019-08-09

people 99.6
adult 99
street 98.7
group together 96.7
man 95.1
one 94.4
monochrome 92.6
road 91.9
group 91.8
woman 91.7
two 90.4
wear 87.3
transportation system 86
shadow 80.1
four 79.6
three 79.2
child 78.7
administration 78.5
several 78.4
light 77.9
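
The Clarifai list has the same label/confidence shape, with scores scaled to percent. A minimal sketch against Clarifai's v2 REST prediction endpoint; the API key and image URL are placeholders, and the model ID shown is Clarifai's public general model, an assumption about which model produced these tags:

```python
# Sketch: tag an image with Clarifai's v2 REST API; the key, image URL,
# and model ID are placeholders/assumptions.
import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')  # value is 0-1
```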

Imagga
created on 2019-08-09

building 22
man 20.2
street 19.3
city 19.1
urban 18.3
architecture 18.2
sport 15.5
road 15.3
outdoor 15.3
people 14.5
structure 14.1
business 13.4
dark 13.4
park 13.2
house 12.9
athlete 12.7
travel 12.7
spectator 12.6
old 12.5
lifestyle 12.3
outdoors 11.9
competition 11.9
person 11.5
walking 11.4
hall 11.3
sports 11.1
recreation 10.8
shop 10.7
run 10.6
track 10.6
ball 10.1
tree 10
active 10
exercise 10
leisure 10
night 9.8
tennis 9.7
world 9.5
speed 9.2
transportation 9
room 8.9
court 8.8
brick 8.7
school 8.7
grass 8.7
day 8.6
empty 8.6
construction 8.6
summer 8.4
sign 8.3
fitness 8.1
sidewalk 8
home 8
male 7.9
running 7.7
garage 7.5
residence 7.4
playing 7.3
new 7.3
lamp 7.3
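
Imagga reports tag confidences on a 0-100 scale directly. A minimal sketch of its tagging endpoint, with placeholder credentials and image URL:

```python
# Sketch: Imagga /v2/tags with HTTP Basic auth; the credentials and
# image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),
)
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```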

Google
created on 2019-08-09

Microsoft
created on 2019-08-09

text 95.8
outdoor 95.5
black and white 90.8
dog 70.9
black 69.3
white 64.3
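
The Microsoft tags come from a Computer Vision analyze call. A minimal sketch against the Azure REST endpoint; the endpoint, key, and image URL are placeholders, and API version v2.0 is an assumption (plausible for the 2019 creation date above):

```python
# Sketch: Azure Computer Vision analyze (Tags feature); endpoint, key,
# and image URL are placeholders, and v2.0 is an assumed API version.
import requests

endpoint = "https://YOUR_REGION.api.cognitive.microsoft.com"
resp = requests.post(
    f"{endpoint}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/photo.jpg"},
)
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')  # confidence is 0-1
```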

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-42
Gender Male, 50.1%
Calm 49.8%
Disgusted 49.5%
Sad 49.6%
Confused 49.5%
Angry 49.8%
Surprised 49.6%
Happy 49.6%
Fear 49.5%

AWS Rekognition

Age 19-31
Gender Male, 50.2%
Happy 49.5%
Angry 49.6%
Surprised 49.5%
Disgusted 49.5%
Fear 49.6%
Calm 49.6%
Sad 50%
Confused 49.6%

AWS Rekognition

Age 38-56
Gender Female, 50.3%
Confused 49.5%
Calm 49.5%
Angry 49.5%
Sad 50.1%
Surprised 49.5%
Fear 49.9%
Happy 49.5%
Disgusted 49.5%

AWS Rekognition

Age 30-46
Gender Female, 50.3%
Confused 49.5%
Sad 49.9%
Surprised 49.6%
Calm 49.6%
Disgusted 49.6%
Happy 49.6%
Fear 49.6%
Angry 49.5%
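
Each block above is one detected face, with an estimated age range, a gender guess, and a distribution over eight emotions. Note that all four emotion distributions here sit near 50% in every category, close to uniform, so they carry little signal. A minimal sketch of producing such output with Rekognition's DetectFaces API, using a placeholder image location:

```python
# Sketch: per-face attributes via Rekognition DetectFaces; bucket/key are
# placeholders. Attributes=["ALL"] requests age, gender, and emotions.
import boto3

rekognition = boto3.client("rekognition")
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # Type is e.g. "CALM", Confidence in percent
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```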

Feature analysis

Amazon

Person 99.5%
Bicycle 83.4%
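
Only Person and Bicycle appear here, matching the labels in the Amazon tag list that Rekognition can localize with bounding boxes. One plausible way to derive this shorter list (an assumption about how the feature analysis was built) is to keep only labels whose Instances array is non-empty:

```python
# Sketch: filter DetectLabels output to labels with bounding-box instances;
# bucket/key are placeholders, and this filtering rule is an assumption.
import boto3

rekognition = boto3.client("rekognition")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)
for label in response["Labels"]:
    if label["Instances"]:  # empty for abstract labels like "Crowd"
        print(f'{label["Name"]} {label["Confidence"]:.1f}%')
```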

Categories

Text analysis

Amazon

Royco
TREVIRA
HAG

Google

lene TREVIRA
lene
TREVIRA
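
The detected strings ("Royco", "TREVIRA", "HAG") are likely signage visible in the photograph, with the two services disagreeing on fragments. A minimal sketch of the Amazon side using Rekognition's DetectText API, again with a placeholder image location:

```python
# Sketch: OCR via Rekognition DetectText; bucket/key are placeholders.
import boto3

rekognition = boto3.client("rekognition")
response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries repeat the same text
        print(detection["DetectedText"])
```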