Human Generated Data

Title

Untitled (students examining plants with teacher inside greenhouse classroom)

Date

1952

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9393

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Tie 99.5
Accessories 99.5
Accessory 99.5
Person 99.4
Person 99.3
Person 99.3
Person 97
Person 96.7
Person 93.8
Person 91.2
Garden 87.2
Outdoors 87.2
Plant 84.9
Worker 76.4
Gardening 73.3
Clothing 71.2
Apparel 71.2
Pottery 60.7
Photography 60.3
Photo 60.3
Gardener 59.2
Female 57.7
Potted Plant 57.5
Jar 57.5
Vase 57.5
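
A minimal sketch of how label/confidence pairs like the Amazon tags above can be produced with the Rekognition DetectLabels API, assuming boto3, configured AWS credentials, and a local copy of the photograph; the file name and confidence threshold are placeholders, not taken from this record.

```python
# Sketch: reproduce label tags of the "Person 99.7", "Garden 87.2" form.
import boto3

def detect_labels(image_path: str, min_confidence: float = 55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a name and a confidence score on a 0-100 scale.
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("photo.jpg"):  # hypothetical local file
        print(name, confidence)
```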

Clarifai
created on 2023-10-26

people 99.9
group together 99.5
group 99
many 98.4
adult 98.2
woman 97.3
war 97.2
man 97
child 96.6
administration 96.2
several 93.4
leader 92.8
soldier 92
boy 91.6
home 90.9
recreation 85.3
police 81.9
military 81.4
five 79.4
offense 79.1

Imagga
created on 2022-01-23

home 27.1
happy 26.3
man 25.5
male 23.7
couple 21.8
person 20.7
people 20.1
adult 19.9
child 18.3
sitting 18
lifestyle 16.6
family 16
smiling 15.9
food 15.5
indoors 14
cheerful 13.8
relaxing 13.6
women 13.4
pretty 13.3
outdoors 12.7
attractive 12.6
house 12.5
boy 12.2
table 11.9
drink 11.7
interior 11.5
mother 11.5
friends 11.3
fun 11.2
love 11
happiness 11
relax 10.9
holiday 10.7
together 10.5
garden 10.4
restaurant 10.2
eating 10.1
meal 10.1
cute 10
wine 9.7
having 9.7
husband 9.5
day 9.4
glass 9.3
smile 9.3
relaxation 9.2
indoor 9.1
holding 9.1
portrait 9
room 9
father 9
lady 8.9
kid 8.9
flowers 8.7
enjoying 8.5
laughing 8.5
friendship 8.4
water 8
celebration 8
shop 7.9
standing 7.8
men 7.7
outside 7.7
flower 7.7
outdoor 7.6
fine 7.6
gardening 7.6
kids 7.5
plant 7.5
building 7.5
joy 7.5
leisure 7.5
20s 7.3
children 7.3
summer 7.1
daughter 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

window 87.5
black and white 81.4
table 68.5
house 67.2
clothing 62.8
person 56.6

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 83.1%
Calm 89.8%
Sad 7.2%
Happy 0.9%
Angry 0.6%
Confused 0.5%
Disgusted 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Female, 81%
Calm 97.3%
Happy 1.1%
Sad 1%
Surprised 0.2%
Fear 0.2%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 36-44
Gender Female, 51.1%
Calm 98.6%
Sad 0.6%
Happy 0.2%
Confused 0.2%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 31-41
Gender Male, 99.6%
Calm 99.1%
Happy 0.5%
Sad 0.2%
Confused 0.1%
Angry 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 31-41
Gender Female, 99.7%
Calm 93.4%
Sad 3.8%
Happy 1.1%
Confused 0.5%
Angry 0.4%
Disgusted 0.3%
Surprised 0.3%
Fear 0.1%
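
A minimal sketch of how per-face results like the age ranges, gender estimates, and emotion percentages above can come from the Rekognition DetectFaces API, again assuming boto3 and AWS credentials; the image path is a placeholder.

```python
# Sketch: request full face attributes and print them in the format used above.
import boto3

def detect_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```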

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
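
A minimal sketch of how the per-face likelihood ratings above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) can be obtained from the Google Cloud Vision face detection API, assuming the google-cloud-vision client library, configured credentials, and a local image file (the path is a placeholder).

```python
# Sketch: print one block of likelihood ratings per detected face.
from google.cloud import vision

def face_likelihoods(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood is an enum: UNKNOWN, VERY_UNLIKELY, ... VERY_LIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```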

Feature analysis

Amazon

Person 99.7%
Tie 99.5%

Text analysis

Amazon

KODOK-EVEELA
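
A minimal sketch of how detected strings such as the one above can be read out of the Rekognition DetectText API response, assuming boto3 and AWS credentials; the image path is a placeholder.

```python
# Sketch: return line-level text detections (WORD entries repeat the same content).
import boto3

def detect_text(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    return [d["DetectedText"] for d in response["TextDetections"]
            if d["Type"] == "LINE"]
```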