Human Generated Data

Title

Untitled (matador standing by wall of bullring)

Date

1965-1968

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.510.1

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Person 98.1
Human 98.1
Person 94.5
Person 94.3
Apparel 93.6
Clothing 93.6
Person 93.3
Person 91.3
Person 88.5
Person 87.6
Person 86.5
Person 83.5
Person 80.2
Person 77.1
Person 74.6
Outdoors 71
People 68.5
Coat 65.1
Airplane 63.3
Transportation 63.3
Vehicle 63.3
Aircraft 63.3
Meal 62.8
Food 62.8
Nature 59.5
Shop 59.5
Pants 55.7
Person 53.8
Person 47.5
Person 43.1
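
The record attributes these labels to AWS Rekognition, but the request itself is not documented. The following is a minimal Python sketch of how such tag/confidence pairs are produced with boto3; the image filename and the MinConfidence threshold are assumptions.

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph; the record does not say
# how the image was supplied to the service.
with open("2007.184.2.510.1.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=40,  # assumed; low enough to include "Person 43.1"
)

# Each label is a name plus a confidence score in percent, matching
# the pairs listed above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))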

Clarifai
created on 2019-08-09

people 99.9
group together 99.6
many 99
group 98.6
adult 98.2
vehicle 97.7
man 96.4
wear 94.7
war 94
military 93.2
aircraft 92.1
crowd 91.5
monochrome 91.2
administration 90.5
transportation system 90.5
watercraft 89
several 87.4
one 85.6
skirmish 85.5
street 85.1

Imagga
created on 2019-08-09

shoe shop 75.3
shop 70.6
mercantile establishment 52.4
place of business 34.9
establishment 17.4
city 15.8
metal 12.9
black 12.6
old 11.8
closeup 11.4
travel 11.3
industry 11.1
work 11
steel 10.6
art 10.4
industrial 10
structure 9.9
barbershop 9.9
urban 9.6
pattern 9.6
modern 9.1
business 9.1
texture 9
transportation 9
material 8.9
building 8.9
architecture 8.6
grunge 8.5
house 8.4
wall 8
design 7.9
colors 7.9
textured 7.9
sea 7.8
iron 7.7
window 7.4
part 7.4
retro 7.4
street 7.4
light 7.3
transport 7.3
rough 7.3
dirty 7.2
working 7.1
surface 7.1
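
Imagga's tagger is a REST endpoint rather than an SDK. A minimal sketch with the requests library, assuming the v2 /tags endpoint; the image URL and credentials below are placeholders, and the record does not document the actual request.

import requests

# Placeholder URL and credentials; Imagga authenticates with HTTP
# basic auth using an API key/secret pair.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/2007.184.2.510.1.jpg"},
    auth=("API_KEY", "API_SECRET"),
)

# Tags arrive as {"tag": {"en": ...}, "confidence": ...} entries,
# matching pairs such as "shoe shop 75.3" above.
for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))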

Google
created on 2019-08-09

(no tags returned)

Microsoft
created on 2019-08-09

text 91.3
indoor 89.2
person 88.3
clothing 85.8
black and white 85.7
ship 53.7
gun 21.3

Face analysis

Amazon

AWS Rekognition

Age 39-57
Gender Female, 52.1%
Happy 45.2%
Sad 51.8%
Angry 45.4%
Fear 45.1%
Confused 45.3%
Calm 47%
Disgusted 45.1%
Surprised 45.1%

AWS Rekognition

Age 35-51
Gender Male, 51.5%
Fear 45%
Angry 45.1%
Happy 47.8%
Disgusted 45%
Sad 47.1%
Calm 49.8%
Surprised 45.1%
Confused 45.1%

AWS Rekognition

Age 28-44
Gender Male, 54.5%
Angry 46.2%
Sad 45.1%
Surprised 45.2%
Fear 45%
Calm 52.6%
Disgusted 45.1%
Confused 45.1%
Happy 45.7%

AWS Rekognition

Age 48-66
Gender Female, 51.8%
Fear 45.2%
Surprised 45.1%
Disgusted 45%
Happy 46.2%
Sad 53%
Calm 45.4%
Angry 45.1%
Confused 45%

AWS Rekognition

Age 46-64
Gender Male, 53.1%
Calm 46.4%
Confused 45.2%
Sad 52.8%
Fear 45.1%
Disgusted 45%
Surprised 45.1%
Angry 45.3%
Happy 45%

AWS Rekognition

Age 40-58
Gender Male, 50%
Fear 49.7%
Confused 49.6%
Calm 49.5%
Sad 49.9%
Disgusted 49.5%
Angry 49.5%
Surprised 49.6%
Happy 49.6%

AWS Rekognition

Age 37-55
Gender Male, 50.2%
Disgusted 49.5%
Fear 49.6%
Happy 49.5%
Angry 49.6%
Confused 49.6%
Sad 49.6%
Surprised 49.6%
Calm 50%

AWS Rekognition

Age 51-69
Gender Male, 50.8%
Sad 47%
Confused 45.5%
Calm 50%
Surprised 45.4%
Angry 45.3%
Fear 45.3%
Disgusted 46.1%
Happy 45.4%

AWS Rekognition

Age 26-40
Gender Male, 50.4%
Surprised 49.6%
Confused 49.5%
Disgusted 49.5%
Calm 50%
Fear 49.7%
Angry 49.6%
Sad 49.6%
Happy 49.5%

AWS Rekognition

Age 16-28
Gender Male, 54.4%
Confused 45.1%
Calm 47.2%
Angry 45.1%
Sad 52.1%
Surprised 45.1%
Fear 45.2%
Happy 45.2%
Disgusted 45%
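
Each block above corresponds to one face returned by Rekognition's DetectFaces operation with full attributes. A minimal boto3 sketch (the image source is again an assumption) showing where the age range, gender estimate, and per-emotion confidences come from:

import boto3

client = boto3.client("rekognition")

with open("2007.184.2.510.1.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# Attributes=["ALL"] requests age, gender, emotions, and more for
# every detected face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # One confidence score per emotion, as in the blocks above.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")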

Feature analysis

Amazon

Person 98.1%
Airplane 63.3%

Text analysis

Google

AUPA
AUPA
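
The OCR result is attributed to Google; assuming the Cloud Vision API was used (the record does not document the call), a minimal sketch follows. text_detection returns the full detected text first and then each individual word, which is why a one-word inscription such as AUPA can appear twice in the output.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("2007.184.2.510.1.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] holds the full text; later entries are the
# individual words it was split into.
for annotation in response.text_annotations:
    print(annotation.description)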