Human Generated Data

Title

Untitled (family, missing child)

Date

c. 1970, from 1960 negative

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18755

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.5
Human 99.5
Person 99.5
Person 99.1
Clothing 99.1
Apparel 99.1
Chair 98.9
Furniture 98.9
Person 97.4
Person 97.4
Person 97.3
Person 96.7
Person 96.5
Person 96.3
Shorts 92.2
Person 91.5
Person 91.1
Housing 88.4
Building 88.4
Shoe 85.1
Footwear 85.1
Face 80.9
Outdoors 74.6
Female 73.7
People 73.7
Leisure Activities 72.6
Sitting 69.7
Kid 69
Child 69
Suit 67.5
Coat 67.5
Overcoat 67.5
Shoe 66.9
Sunglasses 66.6
Accessories 66.6
Accessory 66.6
Girl 65.4
Dress 65.3
Grass 64.5
Plant 64.5
Portrait 62.6
Photography 62.6
Photo 62.6
Brick 61.5
House 60.6
Yard 59.1
Nature 59.1
Crowd 58.5
Musical Instrument 56.7
Smile 56.6
Shoe 51
Person 41.7
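
Each tag above pairs a label with a confidence score from 0 to 100. As a point of reference, the following is a minimal sketch (not the museum's actual pipeline) of how such label/confidence pairs can be produced with the AWS Rekognition DetectLabels API via boto3; the file name and the MinConfidence threshold are illustrative assumptions.

import boto3

rekognition = boto3.client("rekognition")

# Read the image bytes and request labels above an illustrative 40% threshold
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,
    )

# Print "Name Confidence" pairs in the same shape as the tag list above,
# e.g. "Person 99.5"
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")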

Clarifai
created on 2023-10-22

people 100
child 99.4
group 99.3
group together 99.3
many 99
adult 96.7
woman 96.2
education 96
school 95.5
several 95.1
man 94.4
boy 93.5
adolescent 92.9
recreation 92.7
five 88.3
wear 87.3
family 85.9
outfit 84.4
four 83.6
elementary school 78.5

Imagga
created on 2022-03-05

brass 70
wind instrument 54.1
trombone 45.7
musical instrument 38.2
male 26.9
people 26.8
man 26.2
silhouette 24
sport 21.3
adult 18.2
person 18
group 16.9
outdoors 16.4
men 16.3
athlete 14.8
competition 14.6
sunset 14.4
active 14
boy 13
child 12.6
field 12.5
beach 11.8
black 11.4
couple 11.3
team 10.7
family 10.7
fun 10.5
grass 10.3
sky 10.2
outdoor 9.9
player 9.7
women 9.5
play 9.5
day 9.4
happy 9.4
ball 9.4
water 9.3
business 9.1
exercise 9.1
summer 9
together 8.8
run 8.7
lifestyle 8.7
youth 8.5
kin 8.5
children 8.2
vacation 8.2
businessman 7.9
baritone 7.9
goal 7.7
football 7.7
crowd 7.7
running 7.7
device 7.6
dusk 7.6
runner 7.6
kids 7.5
friends 7.5
cornet 7.5
friendship 7.5
ocean 7.5
action 7.4
teenager 7.3
girls 7.3
bugle 7.2
recreation 7.2
trainer 7.1
kid 7.1
happiness 7
spectator 7
travel 7

Google
created on 2022-03-05

Standing 86.4
Window 84.3
Building 84.1
Style 84
Black-and-white 83.7
Dress 83.6
Adaptation 79.2
Art 76.9
Monochrome 76.3
Monochrome photography 75.1
Vintage clothing 71.1
Event 67.8
Room 67.5
Child 66.6
Visual arts 65.9
History 65.7
Sitting 64.7
Chair 63.9
Team 63.6
Illustration 59.7

Microsoft
created on 2022-03-05

person 98.2
clothing 95.4
footwear 85.8
outdoor 85.4
text 77.4
drawing 66.7
man 63.9

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 97.2%
Calm 66.4%
Disgusted 15.8%
Sad 9.3%
Angry 2.2%
Happy 2.1%
Fear 1.8%
Confused 1.3%
Surprised 1.1%

AWS Rekognition

Age 35-43
Gender Male, 98.7%
Calm 96.9%
Sad 2.2%
Happy 0.3%
Surprised 0.2%
Confused 0.2%
Fear 0.1%
Disgusted 0.1%
Angry 0%

AWS Rekognition

Age 30-40
Gender Male, 100%
Surprised 96.6%
Calm 2.5%
Disgusted 0.3%
Angry 0.2%
Confused 0.2%
Sad 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 45-51
Gender Male, 99.6%
Calm 85.9%
Sad 9.8%
Confused 2.4%
Surprised 0.8%
Disgusted 0.4%
Happy 0.4%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 16-24
Gender Male, 99.2%
Calm 93%
Sad 3.5%
Disgusted 0.8%
Angry 0.7%
Confused 0.6%
Surprised 0.6%
Happy 0.6%
Fear 0.2%

AWS Rekognition

Age 10-18
Gender Female, 56.7%
Calm 99.7%
Surprised 0.1%
Sad 0%
Happy 0%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 25-35
Gender Female, 75.1%
Happy 50.9%
Calm 38%
Surprised 6.5%
Fear 1.2%
Disgusted 1.1%
Angry 1%
Sad 0.8%
Confused 0.6%

AWS Rekognition

Age 40-48
Gender Male, 100%
Surprised 64.6%
Calm 30.6%
Fear 1.5%
Sad 1.2%
Happy 1%
Disgusted 0.5%
Confused 0.3%
Angry 0.3%

AWS Rekognition

Age 40-48
Gender Female, 95%
Calm 67.7%
Happy 25.2%
Sad 4.5%
Surprised 0.7%
Disgusted 0.7%
Confused 0.5%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 27-37
Gender Male, 98.3%
Calm 95.8%
Surprised 2.8%
Sad 0.5%
Angry 0.4%
Disgusted 0.3%
Confused 0.1%
Happy 0.1%
Fear 0.1%
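
Each AWS Rekognition block above reports an estimated age range, a gender guess with its confidence, and a confidence score per emotion for one detected face. A minimal sketch, assuming boto3 and a local image file (illustrative, not the museum's actual pipeline), of how that output can be obtained with the DetectFaces API:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned with a confidence per type; sort highest first
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")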

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
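
The Google Vision blocks above report per-face likelihood ratings rather than percentages. A minimal sketch, assuming the google-cloud-vision Python client and a local image file (both illustrative assumptions), of how those ratings are exposed:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum: VERY_UNLIKELY, UNLIKELY, POSSIBLE,
    # LIKELY, or VERY_LIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)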

Feature analysis

Amazon

Person
Shoe
Sunglasses
Person 99.5%
Person 99.5%
Person 99.1%
Person 97.4%
Person 97.4%
Person 97.3%
Person 96.7%
Person 96.5%
Person 96.3%
Person 91.5%
Person 91.1%
Person 41.7%
Shoe 85.1%
Shoe 66.9%
Shoe 51%
Sunglasses 66.6%

Captions

Clarifai
created by general-english-image-caption-blip on 2025-05-04

a photograph of a group of people sitting on chairs in a room -100%

Google Gemini

created by gemini-2.0-flash-lite on 2025-05-04

Here's a description of the image:

Overall:

The image is a black and white negative of a group portrait. It shows a family, possibly a mother and several children, posing indoors near a window. The setting appears to be a rustic, wooden-walled room. The photograph has a vintage feel.

Details:

  • People: There are 12 individuals. It appears to be a family, with a mother holding a baby and several other children of varying ages. An elderly man is holding a toddler.
  • Attire: The clothing seems simple and informal, suggesting a time when dressing up wasn't as common.
  • Setting: The group is situated in front of a wood-paneled wall. The room is sparsely furnished, with the individuals mostly sitting on small stools.
  • Mood/Feel: The photograph evokes a sense of the past. The poses and attire create a historical ambiance. The expressions on the faces suggest a mixture of seriousness and perhaps a hint of shyness.

created by gemini-2.0-flash on 2025-05-04

Here is a description of the image:

The image shows a group of people, predominantly children, posed for a photo in front of what appears to be a wooden or log structure. The image appears to be a photographic negative, as the tones are inverted.

On the left side of the frame, there are children sitting on what appear to be small chairs or stools. A woman is seated centrally, holding what seems to be an infant or a swaddled baby. There is a man standing behind the group holding a toddler. A window is placed behind the man. The overall composition suggests a formal or posed family portrait. The setting looks to be a room inside a building with rough-hewn walls, possibly a home.
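
Captions like the two Gemini descriptions above can be requested by sending the image together with a prompt to the model. A minimal sketch, assuming the google-generativeai Python SDK and an API key; the file name and prompt are illustrative assumptions, not the settings used here.

import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-2.0-flash")

image = Image.open("photo.jpg")
response = model.generate_content([image, "Describe this photograph."])
print(response.text)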

Text analysis

Amazon

4
INFORMATION 4
INFORMATION
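
Detected strings like "4" and "INFORMATION" above are the kind of output returned by AWS Rekognition's DetectText API. A minimal sketch, assuming boto3 and an illustrative file name:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE entries group words; WORD entries are individual tokens
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])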