Human Generated Data

Title

Untitled (photograph of large family group seated on front porch among pots of flowers)

Date

c. 1930, printed later

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13165

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.2
Human 99.2
Person 99.2
Person 99.1
People 98.1
Person 97.6
Person 97
Family 95.9
Person 91.7
Person 91.6
Person 84.1
Person 78.4
Plant 69.4
Person 67.5
Person 66
Flower 65.7
Blossom 65.7
Flower Bouquet 59.7
Flower Arrangement 59.7
Clothing 58.5
Apparel 58.5
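The Amazon tags above are name/confidence pairs of the kind returned by Rekognition's DetectLabels API. A minimal sketch of filtering such results by a confidence threshold — the response shape follows the Rekognition API, the values are copied from the list above, and the helper function is illustrative, not part of the API:

```python
# Sketch: filter Rekognition-style label results by confidence.
# Response shape assumed from the AWS Rekognition DetectLabels API;
# values copied from the tag list above, not fetched live.
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.2},
        {"Name": "Family", "Confidence": 95.9},
        {"Name": "Plant", "Confidence": 69.4},
        {"Name": "Flower", "Confidence": 65.7},
        {"Name": "Clothing", "Confidence": 58.5},
    ]
}

def confident_labels(response, threshold=90.0):
    """Return label names at or above the confidence threshold."""
    return [
        label["Name"]
        for label in response["Labels"]
        if label["Confidence"] >= threshold
    ]

print(confident_labels(response))  # ['Person', 'Family']
```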

Clarifai
created on 2023-10-26

people 100
group 99.9
many 99.4
group together 98.9
adult 98.4
child 98.1
woman 96.6
leader 96.4
administration 96.2
man 95.9
furniture 94.7
several 94.6
education 93.1
boy 91.7
school 89.9
war 89.4
teacher 89.3
outfit 87.9
sit 87.6
nostalgia 87.3

Imagga
created on 2022-01-22

man 34.2
brass 32.7
musical instrument 31.4
male 29.8
people 28.4
wind instrument 27.6
couple 27
person 26.4
adult 24.5
businessman 23.8
office 23.3
television 21.9
happy 21.9
business 21.8
smiling 21.7
home 17.5
sitting 17.2
kin 16.8
room 16.1
men 15.4
indoors 14.9
together 14.9
teacher 14.9
group 14.5
computer 13.7
telecommunication system 13.6
smile 13.5
desk 13.2
portrait 12.3
professional 12.2
senior 12.2
classroom 12
stringed instrument 11.9
indoor 11.9
day 11.8
happiness 11.7
lifestyle 11.5
businesspeople 11.4
cheerful 11.4
boy 11.3
table 11.2
work 11
cornet 10.9
modern 10.5
executive 10.5
chair 10.5
education 10.4
student 10.3
mature 10.2
laptop 10.2
two 10.2
working 9.7
class 9.6
looking 9.6
child 9.5
women 9.5
violin 9.5
meeting 9.4
groom 9.2
businesswoman 9.1
team 8.9
family 8.9
to 8.8
teaching 8.8
love 8.7
school 8.6
corporate 8.6
attractive 8.4
drink 8.3
bowed stringed instrument 8.2
worker 8.1
romance 8
handsome 8
job 8
interior 8
monitor 7.8
face 7.8
bride 7.8
studying 7.7
boss 7.6
husband 7.6
talking 7.6
communication 7.5
house 7.5
learning 7.5
fun 7.5
study 7.4
camera 7.4
mother 7.4
wedding 7.3
alone 7.3
clothing 7.3
children 7.3
black 7.2
romantic 7.1
idea 7.1

Google
created on 2022-01-22

Picture frame 92.8
Dress 84.9
Rectangle 83.5
People 79.4
Suit 77.4
Plant 76.7
Vintage clothing 74
Room 71.8
Event 71.1
Formal wear 67
Classic 65.4
Art 65.3
Stock photography 65.2
History 62.4
Oval 57.7
Visual arts 57.6
Interior design 53.8
Retro style 53.2
Font 52.2
Family 51.6

Microsoft
created on 2022-01-22

text 99.6
posing 99.2
window 97.9
person 97.3
clothing 96.3
wedding dress 92.2
bride 88.7
old 88.6
woman 85.3
flower 83.3
dress 81.3
smile 75.6
electronics 73.4
group 72.9
black 67.6
man 64.5
wedding 51.5
image 47.3
display 38.2
picture frame 27.2

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 100%
Happy 94.9%
Calm 1.8%
Angry 0.7%
Sad 0.6%
Disgusted 0.6%
Surprised 0.5%
Confused 0.5%
Fear 0.4%

AWS Rekognition

Age 29-39
Gender Male, 98.1%
Happy 85.4%
Calm 11%
Surprised 1.2%
Angry 0.6%
Sad 0.5%
Disgusted 0.5%
Confused 0.4%
Fear 0.3%

AWS Rekognition

Age 30-40
Gender Female, 100%
Calm 76.8%
Confused 12.2%
Sad 3.3%
Angry 2.6%
Disgusted 2.6%
Surprised 1%
Happy 0.7%
Fear 0.7%

AWS Rekognition

Age 33-41
Gender Female, 100%
Happy 32.5%
Calm 26.2%
Confused 9.7%
Fear 9.1%
Disgusted 8.9%
Sad 5.6%
Angry 4.2%
Surprised 3.8%

AWS Rekognition

Age 48-56
Gender Female, 85.8%
Calm 99%
Confused 0.2%
Angry 0.2%
Surprised 0.2%
Sad 0.1%
Fear 0.1%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 39-47
Gender Female, 99.9%
Calm 91.5%
Angry 5.9%
Sad 1.4%
Confused 0.9%
Surprised 0.1%
Disgusted 0.1%
Fear 0%
Happy 0%

AWS Rekognition

Age 0-4
Gender Female, 56.6%
Happy 62.7%
Calm 14.8%
Angry 11.8%
Confused 4.9%
Surprised 2.6%
Sad 1.4%
Disgusted 1.1%
Fear 0.8%

AWS Rekognition

Age 48-56
Gender Male, 100%
Calm 98.5%
Confused 0.7%
Sad 0.2%
Angry 0.2%
Surprised 0.2%
Happy 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 10-18
Gender Female, 88%
Angry 60.9%
Sad 21.5%
Calm 13.9%
Confused 1.1%
Fear 1%
Surprised 0.6%
Disgusted 0.6%
Happy 0.3%

AWS Rekognition

Age 49-57
Gender Male, 99.7%
Happy 80.4%
Sad 8.4%
Calm 5.4%
Angry 2.5%
Disgusted 1.2%
Surprised 0.9%
Fear 0.8%
Confused 0.4%

AWS Rekognition

Age 35-43
Gender Female, 88.9%
Happy 99.5%
Confused 0.1%
Calm 0.1%
Surprised 0.1%
Fear 0.1%
Sad 0.1%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 45-51
Gender Male, 99.3%
Calm 100%
Angry 0%
Sad 0%
Confused 0%
Surprised 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 52-60
Gender Male, 100%
Calm 98.8%
Confused 0.5%
Sad 0.3%
Angry 0.1%
Surprised 0.1%
Happy 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 91.5%
Angry 39.2%
Calm 33%
Confused 13.5%
Sad 12.3%
Disgusted 1%
Surprised 0.7%
Fear 0.3%
Happy 0.2%
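Each AWS Rekognition block above is a per-face distribution over eight emotion categories. A minimal sketch of reading the dominant emotion from a DetectFaces-style result — the "Emotions" array shape follows the Rekognition API, the scores are taken from the first face block above, and the helper is illustrative:

```python
# Sketch: pick the dominant emotion from a Rekognition-style
# DetectFaces result. Shape assumed from the AWS API; scores
# copied from the first face block above.
face = {
    "AgeRange": {"Low": 21, "High": 29},
    "Gender": {"Value": "Female", "Confidence": 100.0},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 94.9},
        {"Type": "CALM", "Confidence": 1.8},
        {"Type": "ANGRY", "Confidence": 0.7},
        {"Type": "SAD", "Confidence": 0.6},
        {"Type": "DISGUSTED", "Confidence": 0.6},
        {"Type": "SURPRISED", "Confidence": 0.5},
        {"Type": "CONFUSED", "Confidence": 0.5},
        {"Type": "FEAR", "Confidence": 0.4},
    ],
}

def dominant_emotion(face):
    """Return (type, confidence) of the highest-scoring emotion."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # ('HAPPY', 94.9)
```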

Microsoft Cognitive Services

Age 58
Gender Male

Microsoft Cognitive Services

Age 24
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
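Unlike the percentage scores above, Google Vision reports categorical likelihoods on a five-level scale. A minimal sketch of comparing those levels by mapping them to ordinals — the scale follows the Cloud Vision Likelihood enum, while the ordinal mapping and helper are illustrative:

```python
# Sketch: compare Google Cloud Vision's categorical likelihoods.
# The five-level scale follows the Vision API's Likelihood enum;
# the ordinal mapping and helper are illustrative, not part of the API.
LIKELIHOOD_ORDER = {
    "Very unlikely": 1,
    "Unlikely": 2,
    "Possible": 3,
    "Likely": 4,
    "Very likely": 5,
}

def at_least(likelihood, floor="Possible"):
    """True if `likelihood` is at or above the `floor` level."""
    return LIKELIHOOD_ORDER[likelihood] >= LIKELIHOOD_ORDER[floor]

print(at_least("Likely"))         # True
print(at_least("Very unlikely"))  # False
```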

Feature analysis

Amazon

Person 99.2%