Human Generated Data

Title

Untitled (Ben Shahn taking a photograph, Asia)

Date

January 14, 1960–April 22, 1960

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1998.139

Machine Generated Data

Tags (confidence scores, 0–100)

Amazon
created on 2021-12-15

Person 99.5
Human 99.5
Person 98.4
Person 97.6
Person 97.1
Person 96
Clothing 80.2
Apparel 80.2
Train 69.1
Transportation 69.1
Vehicle 69.1
Person 60.8
Helmet 57.5
Restaurant 56.7
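
Labels of this kind are the sort of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch of how such label/confidence pairs can be reproduced with boto3, assuming configured AWS credentials and a local copy of the image (the file name below is a placeholder):

import boto3

rekognition = boto3.client("rekognition")

# Read the photograph as raw bytes; Rekognition also accepts S3 objects.
with open("shahn_photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Print "Label score" pairs in the same shape as the listing above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')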

Clarifai
created on 2023-10-25

people 99.6
one 97.8
two 97.6
man 97.1
adult 96.5
portrait 95.5
music 93.9
wear 93.5
three 92.2
woman 90.9
war 89.5
retro 88.6
military 84.4
group 81.9
child 81.8
group together 81.2
veil 80.8
indoors 80.5
outfit 77.3
four 75.9

Imagga
created on 2021-12-15

man 28.2
people 22.3
person 20.9
musical instrument 20.7
device 19.4
vehicle 19.1
adult 17.5
male 15.6
car 15.3
accordion 14.8
washboard 14.5
sitting 13.7
business 13.4
men 12.9
work 12.8
wind instrument 12.6
transportation 12.5
inside 11.9
keyboard instrument 11.4
passenger 11.4
shop 11
driver 10.7
working 10.6
one 10.4
black 10.2
call 10.2
equipment 10.1
office 9.8
attractive 9.8
job 9.7
couple 9.6
auto 9.6
industry 9.4
face 9.2
fashion 9
happy 8.8
automobile 8.6
drive 8.5
technology 8.2
computer 8
looking 8
lifestyle 7.9
portrait 7.8
pretty 7.7
communication 7.6
telephone 7.5
human 7.5
music 7.5
outdoors 7.5
holding 7.4
phone 7.4
safety 7.4
transport 7.3
smiling 7.2
worker 7.2
smile 7.1
posing 7.1
interior 7.1
room 7
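
The Imagga tags above are the kind of result Imagga's REST tagging service returns. A minimal sketch, assuming the v2 /tags endpoint with HTTP basic auth; the credentials and image URL are placeholders:

import requests

IMAGGA_KEY = "your_api_key"        # placeholder credentials
IMAGGA_SECRET = "your_api_secret"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/shahn_photo.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Each entry carries a tag (per language) and a 0-100 confidence.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')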

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

clothing 89.5
text 88.3
person 85.3
man 56.5
black and white 50.7

Face analysis

AWS Rekognition

Age 40-58
Gender Female, 73.4%
Happy 97.8%
Calm 0.6%
Surprised 0.3%
Sad 0.3%
Disgusted 0.3%
Fear 0.3%
Confused 0.2%
Angry 0.1%

AWS Rekognition

Age 34-50
Gender Female, 66.6%
Happy 41.5%
Calm 28.3%
Sad 17.1%
Fear 5.5%
Angry 4%
Confused 1.5%
Surprised 1.4%
Disgusted 0.7%

AWS Rekognition

Age 37-55
Gender Female, 50.6%
Angry 28.7%
Calm 21.3%
Fear 16.9%
Sad 16.4%
Happy 6.8%
Disgusted 4.4%
Surprised 2.9%
Confused 2.4%

AWS Rekognition

Age 33-49
Gender Female, 74.3%
Fear 75.1%
Sad 7.8%
Calm 7.3%
Happy 3.6%
Surprised 3.1%
Confused 1.4%
Angry 1.3%
Disgusted 0.5%

AWS Rekognition

Age 12-22
Gender Female, 86.7%
Calm 90.8%
Angry 5.2%
Sad 1.3%
Disgusted 0.9%
Surprised 0.7%
Happy 0.5%
Confused 0.4%
Fear 0.3%
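
The five blocks above match the shape of Amazon Rekognition's DetectFaces output when all facial attributes are requested: an estimated age range, a gender guess with its confidence, and a score for each of eight emotions. A minimal boto3 sketch, assuming configured AWS credentials and a placeholder file name:

import boto3

rekognition = boto3.client("rekognition")

with open("shahn_photo.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # ask for age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; order them to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')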

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
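
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch with the google-cloud-vision Python client, assuming default application credentials and a placeholder file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum; .name yields e.g. VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)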

Feature analysis

Amazon

Person 99.5%
Train 69.1%
