Human Generated Data

Title

Rose and Her Mother, Fall River, MA

Date

1982

People

Artist: Polly Brown, American (born 1941)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.233

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.9
Human 99.9
Person 97.3
Shoe 97.2
Footwear 97.2
Clothing 97.2
Apparel 97.2
Interior Design 96.1
Indoors 96.1
Room 94.8
Chair 81.1
Furniture 81.1
Person 74.9
Appliance 73.4
Bedroom 71.4
Refrigerator 68.5
Living Room 66
Person 65.4
Monitor 64.8
Display 64.8
Screen 64.8
Electronics 64.8
Dorm Room 58.5
Clinic 55.7
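
Labels of this kind are what AWS Rekognition's DetectLabels operation returns: a name plus a 0-100 confidence score per label. A minimal sketch of such a call, assuming boto3 with configured AWS credentials and a hypothetical local file name:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical path; Rekognition accepts raw JPEG/PNG bytes.
with open("rose_and_her_mother.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=55.0,  # the lowest score in the list above is 55.7
    )

# Prints rows in the same "Person 99.9" style as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')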

Clarifai
created on 2023-10-25

people 99.9
room 99.1
chair 99
two 98.2
adult 97.6
man 97.5
monochrome 97.2
woman 96.9
sit 96.3
furniture 96
group 95.9
child 95.6
three 95.2
desk 95.2
group together 93.8
hospital 93.4
seat 92.4
indoors 92.4
family 92.1
medical practitioner 89.6
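
Clarifai scores concepts on a 0-1 scale; multiplied by 100 they match the rows above. A minimal sketch against Clarifai's public general-image-recognition model via the clarifai-grpc client, assuming a hypothetical personal access token and image URL:

from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_PAT"),)  # hypothetical token

request = service_pb2.PostModelOutputsRequest(
    user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
    model_id="general-image-recognition",
    inputs=[resources_pb2.Input(data=resources_pb2.Data(
        image=resources_pb2.Image(url="https://example.org/photo.jpg")))],
)
response = stub.PostModelOutputs(request, metadata=metadata)

# Each concept pairs a name with a 0-1 value, e.g. "people 0.999".
for concept in response.outputs[0].data.concepts:
    print(f"{concept.name} {concept.value * 100:.1f}")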

Imagga
created on 2022-01-08

computer 37.3
office 35.3
laptop 34.4
man 32.9
work 29.8
business 27.9
person 27.5
people 27.3
male 27
room 26.8
businessman 24.7
professional 24.3
chair 23.4
working 23
men 21.5
adult 21.5
indoors 21.1
home 18.4
teacher 18.3
corporate 18
interior 17.7
classroom 17.3
modern 16.8
furniture 16.3
technology 16.3
desk 15.9
device 15.7
handsome 15.2
worker 15.1
notebook 14.9
smiling 14.5
executive 14.2
job 14.2
communication 13.4
house 13.4
education 13
occupation 12.8
equipment 12.8
confident 12.7
businesswoman 12.7
suit 12.6
attractive 12.6
happy 12.5
table 12.5
musical instrument 12.5
meeting 12.3
sitting 12
smile 11.4
indoor 11
network 11
lifestyle 10.8
call 10.4
senior 10.3
monitor 10.2
student 10.1
seat 9.9
group 9.7
success 9.7
employee 9.6
women 9.5
floor 9.3
projector 9.3
briefcase 9
life 8.5
keyboard 8.4
portrait 8.4
pretty 8.4
school 8.4
manager 8.4
mature 8.4
machine 8.1
team 8.1
looking 8
black 7.8
portable computer 7.7
electronic instrument 7.7
hand 7.6
career 7.6
phone 7.4
inside 7.4
light 7.4
successful 7.3
design 7.3
teenager 7.3
barber chair 7.2
support 7.1
to 7.1
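
Imagga exposes tagging as a plain REST endpoint with HTTP Basic auth. A minimal sketch with the requests library, assuming hypothetical API credentials and image URL:

import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # hypothetical URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # hypothetical credentials
)
response.raise_for_status()

# Tags arrive as {"confidence": 37.3, "tag": {"en": "computer"}} entries.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')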

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

indoor 98.7
person 94.4
text 94.1
clothing 94
black and white 82.9
furniture 74.9
man 63.4
footwear 52.7
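
Tags like the Microsoft list are what Azure's Computer Vision tag endpoint returns, with confidences on a 0-1 scale. A minimal sketch with the azure-cognitiveservices-vision-computervision client, assuming a hypothetical endpoint, key, and image URL:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # hypothetical
    CognitiveServicesCredentials("YOUR_KEY"),              # hypothetical
)
result = client.tag_image("https://example.org/photo.jpg")

# Scale the 0-1 confidence by 100 to match the rows above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")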

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 100%
Calm 94.6%
Confused 2%
Surprised 1.4%
Sad 0.7%
Angry 0.6%
Fear 0.4%
Disgusted 0.3%
Happy 0.1%

AWS Rekognition

Age 0-3
Gender Female, 98.2%
Sad 63.7%
Calm 27.6%
Confused 3.4%
Disgusted 2.4%
Angry 1.8%
Happy 0.6%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 0-4
Gender Male, 99.7%
Calm 89.7%
Sad 5.6%
Disgusted 1.2%
Angry 1%
Surprised 0.9%
Happy 0.7%
Confused 0.6%
Fear 0.3%

AWS Rekognition

Age 0-3
Gender Female, 99.7%
Calm 97.9%
Happy 0.6%
Confused 0.4%
Sad 0.4%
Surprised 0.3%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 16-22
Gender Male, 63.3%
Calm 99.7%
Sad 0.1%
Angry 0.1%
Confused 0%
Surprised 0%
Happy 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 25-35
Gender Female, 57%
Surprised 43.6%
Fear 26.7%
Happy 7.7%
Angry 5.9%
Sad 5.7%
Calm 4.3%
Confused 4%
Disgusted 2.2%
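
Blocks like the six above, each an age range, a gender guess with confidence, and eight ranked emotion scores, correspond to AWS Rekognition's DetectFaces operation with all facial attributes requested. A minimal sketch, assuming boto3 and a hypothetical file name:

import boto3

rekognition = boto3.client("rekognition")

with open("rose_and_her_mother.jpg", "rb") as f:  # hypothetical path
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed for age, gender, and emotion estimates
    )

# One FaceDetails entry per detected face, printed in the format above.
for face in response["FaceDetails"]:
    print(f'Age {face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')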

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
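
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch with the google-cloud-vision client, assuming configured credentials and a hypothetical file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("rose_and_her_mother.jpg", "rb") as f:  # hypothetical path
    image = vision.Image(content=f.read())

# Each face annotation carries Likelihood enums for the attributes above.
for face in client.face_detection(image=image).face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)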

Feature analysis

Amazon

Person 99.9%
Shoe 97.2%
Chair 81.1%
Refrigerator 68.5%
