Human Generated Data

Title

Untitled (man and older woman holding framed family portraits)

Date

1962–1971, printed later

People

Artist: Milton Rogovin, American, 1909–2011

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Jon Vein and Ellen Goldsmith-Vein, 2011.605

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 99.6
Human 99.6
Person 98.9
Musician 98.2
Musical Instrument 98.2
Clothing 97.3
Apparel 97.3
Person 86.2
Percussion 76.5
Drum 76.5
Sleeve 73.5
Skin 68.7
Face 63.9
Photography 63.5
Portrait 63.5
Photo 63.5
Gong 60.9
Music Band 59.6
Drummer 57.8
Long Sleeve 57.1
Man 56.9

Clarifai
created on 2018-03-23

people 100
group 99.4
adult 99.1
portrait 98.1
two 98.1
man 97.8
five 97.6
three 97.5
music 97.4
group together 96.3
four 96.2
musician 96.1
leader 95.8
facial expression 95.6
actress 94.7
several 94.5
woman 94
percussion instrument 93.9
administration 92.8
singer 92.1

Imagga
created on 2018-03-23

brass 44
musical instrument 43.5
wind instrument 32.5
banjo 21.7
stringed instrument 20.9
art 18
person 15
crown 14.4
religion 14.3
man 14.1
face 13.5
god 12.4
religious 12.2
money 11.9
business 11.5
statue 11.5
black 11.5
gold 11.5
sculpture 11.4
dollar 11.1
crown jewels 10.9
currency 10.8
adult 10.6
gong 10.6
technology 10.4
cash 10.1
old 9.7
close 9.7
pray 9.7
design 9.6
spirituality 9.6
singer 9.6
temple 9.5
color 9.4
architecture 9.4
finance 9.3
male 9.2
people 8.9
financial 8.9
pattern 8.9
concepts 8.9
ancient 8.6
percussion instrument 8.6
faith 8.6
musician 8.5
culture 8.5
costume 8.5
travel 8.4
symbol 8.1
decoration 8
conceptual 7.9
performer 7.9
paper 7.8
antique 7.8
golden 7.7
worship 7.7
spiritual 7.7
monument 7.5
one 7.5
tourism 7.4
banking 7.3

Microsoft
created on 2018-03-23

person 99.7
posing 98.2
man 91.4
outdoor 89.8
old 86.1
standing 83.4
white 72
people 65.6
vintage 47.6
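
The label lists above pair each tag with a confidence score, which is the form returned by the image-labeling services named in this section. As an illustration only, the Python sketch below shows how Amazon Rekognition's detect_labels call produces tags of this shape; it assumes boto3, configured AWS credentials, and a hypothetical local image file, and it is not the pipeline actually used to generate this record.

import boto3

# Assumes AWS credentials are configured in the environment.
rekognition = boto3.client("rekognition")

# Hypothetical filename; any local JPEG or PNG of the photograph would do.
with open("rogovin_untitled.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # cap the number of labels returned
    MinConfidence=55.0,  # discard low-confidence labels, roughly matching the list above
)

# Print each label with its confidence score, e.g. "Person 99.6"
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')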

Face analysis

AWS Rekognition

Age 48-68
Gender Male, 98.6%
Disgusted 2.1%
Confused 1.8%
Angry 2.8%
Surprised 3.5%
Happy 87.9%
Sad 1.2%
Calm 0.8%

AWS Rekognition

Age 57-77
Gender Male, 92.4%
Calm 33.6%
Sad 29.9%
Confused 4.9%
Happy 1.8%
Disgusted 12%
Angry 14.3%
Surprised 3.5%

AWS Rekognition

Age 23-38
Gender Male, 54.9%
Disgusted 45.1%
Confused 45.3%
Angry 45.2%
Surprised 45.2%
Calm 47.8%
Happy 47.8%
Sad 48.6%

AWS Rekognition

Age 35-52
Gender Male, 50.4%
Disgusted 49.5%
Happy 49.5%
Surprised 49.5%
Sad 49.7%
Calm 50.2%
Angry 49.6%
Confused 49.5%

AWS Rekognition

Age 35-53
Gender Female, 50.4%
Sad 49.8%
Surprised 49.5%
Confused 49.5%
Happy 50.1%
Calm 49.5%
Disgusted 49.5%
Angry 49.5%

AWS Rekognition

Age 19-36
Gender Male, 50.1%
Calm 50.2%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%
Confused 49.6%
Sad 49.6%
Happy 49.5%

AWS Rekognition

Age 48-68
Gender Male, 50.5%
Disgusted 50%
Happy 49.5%
Angry 49.6%
Surprised 49.5%
Sad 49.8%
Calm 49.6%
Confused 49.5%

AWS Rekognition

Age 38-57
Gender Male, 50.4%
Confused 49.6%
Surprised 49.6%
Happy 49.6%
Angry 49.8%
Disgusted 49.8%
Sad 49.6%
Calm 49.6%

AWS Rekognition

Age 35-53
Gender Female, 50.1%
Disgusted 49.5%
Surprised 49.5%
Sad 50.2%
Calm 49.6%
Angry 49.5%
Happy 49.5%
Confused 49.6%

Microsoft Cognitive Services

Age 40
Gender Male

Microsoft Cognitive Services

Age 72
Gender Male

Microsoft Cognitive Services

Age 36
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
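
The age ranges, gender guesses, and emotion percentages in the AWS Rekognition blocks above correspond to the per-face attributes that Amazon Rekognition's detect_faces call returns when all attributes are requested. The Python sketch below is a minimal illustration of that API shape, assuming boto3, configured AWS credentials, and a hypothetical local image file; it is not the pipeline used to produce this record.

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

# Hypothetical filename for a local copy of the photograph.
with open("rogovin_untitled.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, and other attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 48, "High": 68}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 98.6}
    # Emotions come back as a list; report the highest-confidence one.
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f'Age {age["Low"]}-{age["High"]}, '
          f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%, '
          f'{top_emotion["Type"].title()} {top_emotion["Confidence"]:.1f}%')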

Feature analysis

Amazon

Person
Person 99.6%
Person 98.9%
Person 86.2%

Categories

Imagga

paintings art 98.4%
people portraits 1.6%

Captions

Anthropic Claude

Created by claude-3-haiku-20240307 on 2024-12-30

The image shows two older individuals, a man and a woman, holding framed photographs. The photographs appear to be family portraits. The man is holding an image of a bride and groom, while the woman is holding an image of a larger family group. The background of the image suggests it is taken in a home setting, with a floral arrangement and other personal items visible. The expressions on the faces of the individuals suggest they are proud or sentimental while displaying these cherished photographs.

Created by claude-3-5-sonnet-20241022 on 2024-12-30

This is a black and white photograph showing two people standing side by side, displaying ornate oval picture frames containing family photographs. On the left is someone wearing a white t-shirt and dark pants, holding a framed photo that appears to be a wedding portrait. On the right is an elderly person in a patterned dress holding another framed photograph that seems to show a family group. In the background, there appears to be another framed photograph on the wall and what looks like a vase with flowers. The photograph has a documentary style typical of mid-20th century portraiture, capturing what appears to be a moment of family history being shared across generations.