Human Generated Data

Title

Erik & Dylan, May 8, 1983 (Mother's Day)

Date

1983

People

Artist: Judith Black, American (born 1945)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.325

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99.1
Clothing 98
Apparel 98
Home Decor 97.5
Shoe 96.5
Footwear 96.5
Shoe 92.7
Pants 71.9
Vehicle 70.4
Transportation 70.4
Sleeve 67.4
Meal 65.6
Food 65.6
Window 57.9
Shorts 57
Long Sleeve 55.6
Door 55.2
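
The number beside each tag reads as a per-label confidence percentage, the way label-detection services such as AWS Rekognition report them. As a minimal sketch (not the museum's pipeline), here is how a consumer of this record might keep only high-confidence tags, mirroring Rekognition's MinConfidence idea; the sample values are copied from the list above, and the helper name is hypothetical.

```python
# Hypothetical sketch: filter machine-generated tags by a minimum
# confidence threshold. Label/score pairs copied from the record above.
labels = [
    ("Person", 99.7), ("Clothing", 98.0), ("Home Decor", 97.5),
    ("Shoe", 96.5), ("Pants", 71.9), ("Vehicle", 70.4),
    ("Window", 57.9), ("Door", 55.2),
]

def filter_labels(labels, min_confidence=90.0):
    """Keep only labels at or above the given confidence threshold."""
    return [name for name, score in labels if score >= min_confidence]

print(filter_labels(labels))  # → ['Person', 'Clothing', 'Home Decor', 'Shoe']
```

Lower-scoring tags like "Vehicle" (70.4) and "Door" (55.2) drop out at this threshold, which matches how quickly these lists degrade below roughly 90%.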

Clarifai
created on 2023-10-25

people 99.9
child 99.5
street 99.4
monochrome 99.3
two 98.1
boy 97.6
man 95.6
adult 95.1
group together 94.2
woman 94.1
group 94
transportation system 93.6
offspring 93.4
vehicle 92.9
three 92.6
family 89.6
portrait 89.4
black and white 87.1
vehicle window 86.1
son 84.8

Imagga
created on 2022-01-09

musical instrument 30.7
man 28.2
accordion 21.5
people 20.6
male 20
keyboard instrument 19.4
person 17.6
industry 17.1
wind instrument 15.8
working 15
work 13.9
worker 13.5
business 13.4
adult 12.9
transportation 12.5
black 12.5
urban 12.2
equipment 12.2
device 11.8
city 11.6
building 11.4
vehicle 11.2
men 11.2
forklift 11
occupation 11
transport 11
office 10.5
construction 10.3
industrial 10
job 9.7
steel 9.7
repair 9.6
home 9.6
smiling 9.4
site 9.4
safety 9.2
happy 8.8
labor 8.8
walk 8.6
portrait 8.4
machine 8.4
house 8.4
support 8.3
human 8.2
travel 7.7
sitting 7.7
modern 7.7
outside 7.7
outdoor 7.6
life 7.5
outdoors 7.5
window 7.4
technology 7.4
street 7.4
metal 7.2
clothing 7.2
architecture 7
sky 7

Microsoft
created on 2022-01-09

clothing 95.8
person 94.2
text 92.3
black and white 91.3
standing 88.4
street 80.5
monochrome 74.8
footwear 51.6
posing 37.5
clothes 15.3
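
The four tag lists above overlap only partially. As an illustrative sketch (assumed normalization, not part of the record), one could lowercase each service's tags and count how many engines agree on each label; the subsets below are copied from the lists above.

```python
# Sketch: cross-service tag agreement. Tag subsets copied (lowercased)
# from the Amazon, Clarifai, Imagga, and Microsoft lists above.
from collections import Counter

amazon    = {"person", "clothing", "shoe", "footwear", "window"}
clarifai  = {"people", "street", "monochrome", "black and white", "portrait"}
imagga    = {"man", "people", "street", "window", "portrait", "clothing"}
microsoft = {"clothing", "person", "street", "black and white",
             "monochrome", "footwear"}

# Count how many services emitted each tag, then keep majority tags.
counts = Counter(tag for tags in (amazon, clarifai, imagga, microsoft)
                 for tag in tags)
consensus = sorted(tag for tag, n in counts.items() if n >= 3)
print(consensus)  # → ['clothing', 'street']
```

Even this crude count shows the services converging on the scene's basics (clothing, street) while disagreeing on everything finer-grained, since near-synonyms like "person"/"people" and "monochrome"/"black and white" are not merged here.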

Color Analysis

(color swatch data from the page did not survive text extraction)
Face analysis

AWS Rekognition

Age 6-12
Gender Male, 99.6%
Calm 64.4%
Sad 34.7%
Confused 0.2%
Angry 0.2%
Fear 0.2%
Happy 0.1%
Disgusted 0.1%
Surprised 0%

AWS Rekognition

Age 31-41
Gender Female, 67.2%
Calm 48%
Confused 44.1%
Angry 2.9%
Surprised 2.3%
Sad 1.3%
Disgusted 0.7%
Happy 0.4%
Fear 0.2%
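
Each face record lists emotion scores in percent that sum to roughly 100. A minimal sketch of how a dominant emotion could be read off such a record (the scores are copied from the second AWS Rekognition face above; the helper is hypothetical, not Rekognition's own API):

```python
# Sketch: pick the highest-scoring emotion from a percent-score dict.
# Values copied from the second AWS Rekognition face record above.
emotions = {
    "Calm": 48.0, "Confused": 44.1, "Angry": 2.9, "Surprised": 2.3,
    "Sad": 1.3, "Disgusted": 0.7, "Happy": 0.4, "Fear": 0.2,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(emotions))  # → ('Calm', 48.0)
```

Note how close the top two scores are here (Calm 48% vs. Confused 44.1%): with a near-tie like this, the single "dominant" label is not a reliable summary of the prediction.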

Microsoft Cognitive Services

Age 8
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
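
Unlike the other services, Google Vision reports face attributes as likelihood buckets rather than numeric scores. A small sketch mapping those bucket names (the Vision API's Likelihood enum values) onto an ordinal scale, so they can be compared alongside the percentage-based results above:

```python
# Sketch: ordinal ranking for Google Vision likelihood buckets.
# Bucket names follow the Vision API's Likelihood enum.
LIKELIHOOD_ORDER = [
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
    "POSSIBLE", "LIKELY", "VERY_LIKELY",
]

def likelihood_rank(bucket):
    """Ordinal position of a likelihood bucket (higher = more likely)."""
    return LIKELIHOOD_ORDER.index(bucket.upper().replace(" ", "_"))

print(likelihood_rank("Very unlikely"))  # → 1
```

All six attributes in this record sit at "Very unlikely" (rank 1), i.e. Vision detected no strong facial expression, headwear, or blur.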

Feature analysis

Amazon

Person 99.7%
Shoe 96.5%

Categories

Imagga

paintings art 99.5%